Facebook Ranking News Sources By Trust Is A Bad Idea... But No One At Facebook Will Read Our Untrustworthy Analysis

from the you-guys-are-doing-it-wrong-again dept

At some point I need to write a bigger piece on these kinds of things, though I've mentioned it here and there over the past couple of years. For all the complaints about how "bad stuff" is appearing on the big platforms (mainly: Facebook, YouTube, and Twitter), it's depressing how many people think the answer is "well, those platforms should stop the bad stuff." As we've discussed, this is problematic on multiple levels. First, handing over the "content policing" function to these platforms is, well, probably not such a good idea. Historically they've been really bad at it, and there's little reason to think they're going to get any better no matter how much money they throw at artificial intelligence or how many people they hire to moderate content. Second, it requires some sort of objective reality for what's "bad stuff." And that's impossible. One person's bad stuff is another person's good stuff. And almost any decision is going to get criticized by someone or another. It's why suddenly a bunch of foolish people are falsely claiming that these platforms are required by law to be "neutral." (They're not).

But, as more and more pressure is put on these platforms, eventually they feel they have little choice but to do something... and inevitably, they try to step up their content policing. The latest, as you may have heard, is that Facebook has started to rank news organizations by trust.

Facebook CEO Mark Zuckerberg said Tuesday that the company has already begun to implement a system that ranks news organizations based on trustworthiness, and promotes or suppresses its content based on that metric.

Zuckerberg said the company has gathered data on how consumers perceive news brands by asking them to identify whether they have heard of various publications and if they trust them.

“We put [that data] into the system, and it is acting as a boost or a suppression, and we’re going to dial up the intensity of that over time," he said. "We feel like we have a responsibility to further [break] down polarization and find common ground.”
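To be concrete about what a "boost or a suppression" dialed up over time might mean mechanically, here is a minimal sketch in Python -- entirely our own illustration, with made-up names and numbers, not anything Facebook has published -- of a survey-derived trust score acting as a ranking multiplier with an adjustable intensity knob:

    # Purely illustrative: a survey-derived "trust" score used as a boost or
    # suppression on a post's ranking, with an adjustable intensity setting.
    # All names and numbers here are assumptions, not Facebook's actual system.

    def adjusted_rank(base_score: float, trust: float, intensity: float) -> float:
        """Scale a post's base ranking score by its publisher's trust score.

        base_score: whatever the feed algorithm already computed for the post.
        trust:      survey-derived publisher trust, normalized to 0.0-1.0.
        intensity:  how hard trust is "dialed up"; 0 means trust is ignored.
        """
        # Trust above 0.5 boosts the post, trust below 0.5 suppresses it.
        multiplier = 1.0 + intensity * (trust - 0.5)
        return base_score * max(multiplier, 0.0)

    # The same post, from a low-trust and a high-trust publisher:
    print(adjusted_rank(10.0, trust=0.2, intensity=1.0))  # 7.0  (suppressed)
    print(adjusted_rank(10.0, trust=0.9, intensity=1.0))  # 14.0 (boosted)

Even in a toy version like this, all of the judgment lives in that single trust number -- which is exactly where the trouble starts.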

But, as with the lack of an objective definition of "bad," you've got the same problem with "trust." For example, I sure don't trust "the system" that Zuckerberg mentions above to do a particularly good job of determining which news sources are trustworthy. And, again, trust is such a subjective concept that lots of people inherently trust certain sources over others -- even when those sources have long histories of being full of crap. And given how much "trust" is actually driven by "confirmation bias," it's difficult to see how this solution from Facebook will do any good. Take, for example (totally hypothetically), a situation in which Facebook determines that Infowars is untrustworthy. Many people may agree that a site famous for spreading conspiracy theories and pushing sketchy "supplements" that you need because of conspiracy theory x, y or z is not particularly trustworthy. But, for those who do like Infowars, how are they likely to react to this kind of thing? They're not suddenly going to decide the NY Times and the Wall Street Journal are more trustworthy. They're going to see it as a conspiracy by Facebook to continue suppressing the truth.

Confirmation bias is a hell of a drug, and Facebook trying to push people in one direction is not going to go over well.

To reveal all of this, Zuckerberg apparently invited a bunch of news organizations to talk about it:

Zuckerberg met with a group of news media executives at the Rosewood Sand Hill hotel in Menlo Park after delivering his keynote speech at Facebook’s annual F8 developer conference Tuesday.

The meeting included representatives from BuzzFeed News, the Information, Quartz, the New York Times, CNN, the Wall Street Journal, NBC, Recode, Univision, Barron’s, the Daily Beast, the Economist, HuffPost, Insider, the Atlantic, the New York Post, and others.

We weren't invited. Does that mean Facebook doesn't view us as trustworthy? I guess so. So it seems unlikely that he'll much care about what we have to say, but we'll say it anyway (though you probably won't be able to read this on Facebook):

Facebook: You're Doing It Wrong.

Facebook should never be the arbiter of truth, no matter how much people push it to be. Instead, it can and should be providing tools for its users to have more control. Let them create better filters. Let them apply their own "trust" metrics, or share trust metrics that others create. Or, as we've suggested on the privacy front, open up the system to let third parties come in and offer up their own trust rankings. Will that reinforce some echo chambers and filter bubbles? Perhaps. But that's not Facebook's fault -- it's part of the nature of human beings and confirmation bias.
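To make that concrete, here is a minimal sketch -- with every name invented purely for illustration; Facebook offers no such API today -- of what a pluggable "trust overlay" could look like, where the user (or a third party they choose), rather than the platform, decides how sources get scored and ranked:

    # Hypothetical sketch of a user-selected "trust overlay": anyone can publish
    # a set of source ratings, and each user picks which overlay shapes their feed.
    # These names and structures are assumptions for illustration only.

    from typing import Callable, Dict, List

    # An overlay maps a source's domain to a trust score between 0.0 and 1.0.
    TrustOverlay = Callable[[str], float]

    def overlay_from_ratings(ratings: Dict[str, float], default: float = 0.5) -> TrustOverlay:
        """Build an overlay from a shared ratings table; unknown sources get the default."""
        return lambda domain: ratings.get(domain, default)

    def rerank_feed(posts: List[dict], overlay: TrustOverlay, min_trust: float = 0.0) -> List[dict]:
        """Rerank (and optionally filter) a feed using the user's chosen overlay,
        instead of one platform-wide trust ranking applied to everyone."""
        visible = [p for p in posts if overlay(p["source"]) >= min_trust]
        return sorted(visible, key=lambda p: overlay(p["source"]), reverse=True)

    # Two users can apply different community-made overlays to the same feed:
    feed = [{"source": "example-tabloid.com", "title": "Shocking!"},
            {"source": "example-wire.org", "title": "Budget vote scheduled"}]
    skeptic = overlay_from_ratings({"example-tabloid.com": 0.1, "example-wire.org": 0.8})
    print(rerank_feed(feed, skeptic, min_trust=0.3))  # the tabloid gets filtered out

The point of the sketch isn't the scoring math; it's who gets to supply the ratings table.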

Or, hey, Facebook could take a real leap forward and move away from being a centralized silo of information and truly disrupt its own setup -- pushing the information and data out to the edges, where the users could have more control over it themselves. And not in the simplistic manner of Facebook's other "big" announcement of the week about how it'll now let users opt out of Facebook tracking them around the web (leaving out that it kinda needed to do this to deal with the GDPR in the EU). Opting out is one thing -- pushing the actual data control back to the end users and distributing it is something entirely different.

In the early days of the web, people set up their own websites, and had pretty much full control over the data and what was done there. It was much more distributed. Over time we've moved more and more to this silo model in which Facebook is the giant silo where everyone puts their content... and has to play by Facebook's rules. But with that came responsibility on Facebook's part for everything bad that anyone did on its platform. And, hey, let's face it, some people do bad stuff. The answer isn't to force Facebook to police all the bad stuff; it should be to move back towards a system where information is more distributed, and we're not pushed towards certain content just because Facebook thinks it will lead to the most "engagement."

Push the content and the data out and focus on the thing that Facebook has always been best at, at its core: the connection function. Connect people, but don't control all of the content. Don't feel the need to police the content. Don't feel the need to decide who's trustworthy and who isn't. Be the protocol, not the platform, and open up the system so that anyone else can provide a trust overlay, and let those overlays compete. It would take Facebook out of the business of having to decide what's good and what's bad, and would give end users much more control.

Facebook, of course, seems unlikely to do this. The value of the control is that it allows Facebook to capture more of the money from the attention generated on its platform. But, really, if it doesn't want to keep dealing with these headaches, it seems like the only reasonable way forward.


Filed Under: confirmation bias, fake news, mark zuckerberg, metrics, news, ranking, trust, trustworthy
Companies: facebook


Reader Comments



  1. hij (profile), 2 May 2018 @ 9:49am

    Everything old is new again

    Looks like Facebook learned nothing from AOL and their attempts to keep other businesses out. At least I do not have to figure out clever things to do with Facebook CDs. This does have the advantage, though, in that it will split European regulators between those that favour personal liberty and those who favour traditional media outlets.

  2. Ninja (profile), 2 May 2018 @ 10:06am

    How about don't remove anything unless it's clearly a crime and even then if it's a crime in a country and not in another then just filter the former out. And tell sensitive people they can go fuck themselves and just hide the thing they don't like?

    It's pretty simple, if it's a crime then hide it (or remove if it's one of those universal things like child porn) and go after the goddamn source. Otherwise just let people hide it and make their own 'feeds' rosy unicorn filled things.

    If countries like Germany want to push ethereal 'fake news' or 'hate speech' laws then filter the heck out of their content but let other more mature countries have the thing censorship free.

    Is it that hard?

  3. Apple support, 2 May 2018 @ 10:44am (flagged by the community)

    After the debacle of the Facebook data-leaks news, everyone is scared to put anything on it, and also does not feel safe or trust Facebook's new policies.

  4. Richard (profile), 2 May 2018 @ 10:50am

    Human moderators

    Of course the solution that many people out there believe to be the best - human moderators - is actually the worst - because the people they use are going to be paid peanuts - and will probably do an even worse job than an algorithm.

  5. Richard (profile), 2 May 2018 @ 10:54am

    MSM Trust

    "trust is such a subjective concept, that lots of people inherently trust certain sources over others even when those sources have long histories of being full of crap"

    Most MSM outlets have a reputation for being trustworthy - which survives right up to the point where they report on something you have actual first-hand knowledge about - then you realise how bad they actually are - and you never trust them again.

    It is not a matter of trusting some outlets - it is a matter of using the output of multiple sources, where you know at least roughly what their bias and expertise is - and then forming your own opinion.

  6. Thad, 2 May 2018 @ 10:56am

    Re: Everything old is new again

    Looks like Facebook learned nothing from AOL

    You kidding me?

    Facebook learned everything from AOL.

  7. btr1701 (profile), 2 May 2018 @ 10:58am

    They Openly Admit It.

    "Facebook official Campbell Brown, a former anchor on NBC and CNN, told attendees at a recent technology and publishing conference that Facebook would be censoring news publishers based on its own internal biases."

    Pretty much says it all.

  8. Anonymous Coward, 2 May 2018 @ 11:22am

    Re: They Openly Admit It.

    Googling the source of that "quote" certainly does say it all.

  9. Richard (profile), 2 May 2018 @ 11:40am

    Re: Re: They Openly Admit It.

    Googling the source of that "quote" certainly does say it all.

    Dismissing a story because of the messenger without bothering to check further says something about you.

  10. Richard (profile), 2 May 2018 @ 11:45am

    Re: Re: They Openly Admit It.

    In addition you would expect them to do exactly that - because that is what everyone (including the conservative media) does.

    More worrying is how Facebook is kowtowing to the most illiberal regimes on the planet:

    https://www.theguardian.com/technology/2017/jul/19/facebook-pakistan-blasphemy-laws-censorship

  11. Anonymous Coward, 2 May 2018 @ 12:07pm (flagged by the community)

    Do you EVER, for even ten seconds, worry about GOOGLE?

    No, and there's a reason.

    [Repeated from next story after seeing this above:

    "the big platforms (mainly: Facebook, YouTube, and Twitter)"

    Astute enough to spot which GIANT "platform" is left out? As always here.]

  12. Christenson, 2 May 2018 @ 12:12pm

    Context, context....

    In moderation, context is everything:

    Alex Jones posts something "dangerous" on infowars.

    Techdirt writes about Alex Jones calling him a "paid troll", and links to the original.

    What should stay "up"? Yes, some function has to decide what we see....but deleting bad stuff isn't it!

  13. Anonymous Coward, 2 May 2018 @ 12:13pm

    Re: Do you EVER, for even ten seconds, worry about GOOGLE?

    I have news for you: YouTube is the main self-publishing platform provided by Google; they also own Blogger. However, the main search engine is not a platform that allows people to self-publish; it is just an index of the Internet, and indexes all the other self-publishing platforms, like Facebook, Twitter, etc., as well as YouTube.

  14. Anonymous Coward, 2 May 2018 @ 12:17pm

    "trustworthy" MSM can easily become war propaganda cheerleaders

    The "trustworthiness" of the press was a big argument that I was having with many people in 2003 (and consistently losing) as my country was preparing to invade Iraq for possession of "weapons of mass destruction" and "ties to Bin Ladin" and various other accusations. That's because all the so-called "respectable" and "trustworthy" news organizations such as the NY Times were pushing a unified narrative that basically ignored all the contradictory evidence, and in many cases, promoted outright lies as incontrovertible fact.

    There were many small American news outlets -- including Infowars -- that were reporting a much more accurate analysis of the Iraq invasion than any of the "trusted" American mainstream media were reporting. The well-worn line "if we only knew then what we know now" got thrown around an awful lot in the aftermath of not finding any of the claimed weapons that the invasion was supposed to justify, but the sad fact is that everything was already known "then" (before the invasion) and was all over the internet for anyone who cared to do their own research and draw their own conclusions. It was primarily only the US mainstream media that got it wrong, while the rest of the world and the "conspiracy" minded websites were almost universally correct in their skepticism and their reporting of critical facts that the MSM refused to touch.

    Many foreign newspapers, including the UK's Independent and Guardian, dared to question the official war narrative, but not a single U.S. based news outlet dared stray from the official Bush-Cheney war propaganda. The American mainstream press had completely stopped being journalists and were instead acting as cheerleaders for a bogus war that ultimately claimed millions of lives and trillions of dollars.

    I can't believe 15 years have already passed, and I'm still literally seething with rage about the mainstream media leading this country over a cliff -- and then blaming it on everyone else but themselves.

    The 2003 Iraq invasion is the single biggest reason why I don't trust any of the American mainstream media news outlets -- and I'm sure I never will for as long as I live.

  15. Christenson, 2 May 2018 @ 12:18pm

    Re: MSM Trust

    This "getting stuff all screwed up" isn't exactly unique to traditional "main stream media". It's a function of the people in the system.

    Not that I think there hasn't been a disconnect between what shows up in MSM and most people's direct experience. Tell me about the people in your life who have died, and compare that to what you see in the media. I know some that have wrecked cars and died, but this is not news. We'll lose a few hundred to fentanyl, but this is not on the news. 20,000 or so will shoot themselves and die this year, but this is not news.

  16. Mike Masnick (profile), 2 May 2018 @ 12:39pm

    Re: Do you EVER, for even ten seconds, worry about GOOGLE?

    Astute enough to spot which GIANT "platform" is left out? As always here.]

    Er, YouTube is Google. The reason I included YouTube rather than Google in general is that YouTube is Google's primary user generated content platform, and thus the main platform that people are demanding have their content moderated on...

  17. Richard (profile), 2 May 2018 @ 12:50pm

    Re: Re: MSM Trust

    This "getting stuff all screwed up" isn't exactly unique to traditional "main stream media". It's a function of the people in the system.

    True but what is unique to MSM is that some people DO trust them.

  18. Richard (profile), 2 May 2018 @ 12:53pm

    Re: "trustworthy" MSM can easily become war propaganda cheerleaders

    And it looks like something similar is in process right now with Iran/Russia.

  19. SirWired, 2 May 2018 @ 12:54pm

    I have no problem with this

    Given a finite amount of data that is going to pop up in a user's feed, there are going to be SOME criteria for what makes the grade. Some internal measure of how "trustworthy" the source is seems as good a criterion as any.

    It's no different from a search engine ranking search results based on their best guess as to how "useful" the search result is instead of just counting the number of times the keywords show up on the page. Sometimes they'll guess well, sometimes they won't, and useful results will be ranked lower than some pathetic spam.

  20. Anonymous Coward, 2 May 2018 @ 1:10pm

    Re: Re: Re: They Openly Admit It.

    It's literally a fictional quote made up by a non-credible conspiracy blog.

    Whine all you want.

  21. Anonymous Coward, 2 May 2018 @ 1:29pm

    Two people read the same newspaper article. A cat person and a dog person.

    The dog person reads the article and rates it as accurate.

    The cat person reads the article and rates it as questionable.

    Six months go by and both revisit the article in light of new information.

    The dog person rates it as accurate and credible.

    The cat person rates it as debunked and not-credible.

    Therein lies the challenge with building out algorithms to automate this process. Let's say you assign a credibility score based on the historic human-based scoring of credibility by source. How frequently that source, or author, accurately reports the news. With as polarized as sources are today, the dog person and the cat person training an algorithm will come away with completely different automated processes reflecting their subjective world view.

    That is where we are today.

    We have automated algorithms assigning high credibility to sources reporting that the CrowdStrike analysis of the DNC servers suggests it was done by Russians.

    We have Bill Binney and associated security researchers claiming that what is known about the event means it has to have been an insider from the DNC.

    Huge political stakes are at play so expect both sides of that coin to fight tooth and nail to force their will over control of the algorithms that do get deployed.

  22. Anonymous Coward, 2 May 2018 @ 1:55pm

    Re: Re: "trustworthy" MSM can easily become war propaganda cheerleaders

    As Hearst and Pulitzer learned --and demonstrated-- way back in the 19th century: war sells newspapers. So it makes perfect sense that the press will always be on the side of starting wars, because it's in their own commercial interest. (and it's only much later when the public loses interest that they change tack.)

    It's interesting (but not unexpected) that even Trump's harshest critics in the press loudly applauded him when he bombed the Syrian government, both last year and this year. And what could be better for newspaper sales and cable news ratings than war against Russia?

  23. John85851 (profile), 2 May 2018 @ 1:55pm

    What about country bias?

    What I mean is that many (or most) people in the US blindly trust US media, for good or bad. Yet when you mention something like The Guardian or the BBC or Al-Jazeera, people claim those aren't US-sources so they can't be trusted.

  24. Richard (profile), 2 May 2018 @ 2:28pm

    Re:

    Therein lies the challenge with building out algorithms to automate this process. Let's say you assign a credibility score based on the historic human-based scoring of credibility by source. How frequently that source, or author, accurately reports the news. With as polarized as sources are today, the dog person and the cat person training an algorithm will come away with completely different automated processes reflecting their subjective world view.

    Actually it is far more complicated than that.

    There is no straight left - right axis anymore

    Foreign policy is even more complicated.

    Compare for example George Galloway's take on the world with that of Peter Hitchens.

    On the Israel vs Arab (Muslim) axis they would violently disagree

    On the Russia vs the West axis they agree (with each other) and both disagree with the western establishment view.

    The fact is that most people (and news outlets) are a mixture of different biases on different topics - and they may change with time.

    Generally I tend to trust the BBC and the Guardian - but on some issues I know that they have a bias and so I will look at other sources too.

  25. Anonymous Coward, 2 May 2018 @ 2:29pm

    Re: Re: Everything old is new again

    There is a difference between learn and copy

  26. Richard (profile), 2 May 2018 @ 2:30pm

    Re:

    How frequently that source, or author, accurately reports the news.

    A much better criterion would be "how often do they report, or give publicity to, alternative opinions?"

  27. Confirmation Bias, 2 May 2018 @ 2:32pm

    Re: MSM Trust

    I trust any source that regurgitates my particular point of view.

  28. Anonymous Coward, 2 May 2018 @ 4:16pm

    Re: What about country bias?

    For some (not most as you claim) it is worse than that as they only believe infowars and brietfart while calling fox news libral

  29. Anonymous Coward, 2 May 2018 @ 7:05pm

    Re: Do you EVER, for even ten seconds, worry about GOOGLE?

    Masnick has criticized Google in other articles, but your track record shows you don't like those either. Because reasons.

    The easier thing to do is to not give you what you want because giving you what you want is an endless chore akin to filling a bottomless pit...

  30. oliver (profile), 2 May 2018 @ 11:03pm

    Dear Mike
    That is not nearly enough what should happen to FB!
    Look at Cambridge Analytica, there the nuclear option has run its course, and rightfully so.
    Therefore FB is next for extermination!
    Cheers, oliver

  31. Yes, I know I'm commenting anonymously, 3 May 2018 @ 4:01am

    Trust for Sale

    It won't be long before fb starts selling 'trust' (i.e. ranking points to get higher on the trusted list) to any comers.

  32. Ninja (profile), 3 May 2018 @ 6:07am

    Re: Re: Do you EVER, for even ten seconds, worry about GOOGLE?

    Too complex for his/her/its underdeveloped brain.

  33. Thad, 3 May 2018 @ 9:35am

    Re:

    Look at Cambridge Analytica, there the nuclear option has run its course, and rightfully so.

    If by "nuclear option" you mean "the founders ditched the company and started a new one", sure.
