New Study Suggests That YouTube's Recommendation Algorithm Isn't The Tool Of Radicalization Many People Believe (At Least Not Any More)
from the well,-look-at-that dept
It's become almost "common knowledge" that various social media recommendation engines "lead to radicalization." Just recently, while giving a talk to telecom execs, I was told, point blank, that social media was clearly evil and clearly driving people into radicalization because "that's how you sell more ads," and that nothing I could say could convince them otherwise. Thankfully, though, there's a new study that throws some cold water on those claims by showing that YouTube's algorithm -- at least as of late 2019 -- appears to be doing the opposite.
To the contrary, these data suggest that YouTube's recommendation algorithm actively discourages viewers from visiting radicalizing or extremist content. Instead, the algorithm is shown to favor mainstream media and cable news content over independent YouTube channels....
Indeed, as you read through the report, it suggests that if YouTube's algorithm has any bias at all, it's one towards bland centrism.
The recommendations algorithm advantages several groups to a significant extent. For example, we can see that when one watches a video that belongs to the Partisan Left category, the algorithm will present an estimated 3.4M impressions to the Center/Left MSM category more than it does the other way. On the contrary, we can see that the channels that suffer the most substantial disadvantages are again channels that fall outside mainstream media. Both right-wing and left-wing YouTuber channels are disadvantaged, with White Identitarian and Conspiracy channels being the least advantaged by the algorithm. For viewers of conspiracy channel videos, there are 5.5 million more recommendations to Partisan Right videos than vice versa.
We should also note that right-wing videos are not the only disadvantaged groups. Channels discussing topics such as social justice or socialist views are disadvantaged by the recommendations algorithm as well. The common feature of disadvantaged channels is that their content creators are seldom broadcasting networks or mainstream journals. These channels are independent content creators.
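To make the quoted "advantage" figures concrete, here's a minimal sketch (in Python, using pandas, with entirely hypothetical impression counts and column names) of how one might compute that kind of net recommendation flow between channel categories; only the differences mirror the figures quoted above, and this is just an illustration of the arithmetic, not the authors' actual code.

```python
import pandas as pd

# Hypothetical data: each row is an estimated number of recommendation
# impressions flowing from videos in one channel category to another.
# The individual counts are made up; only their differences echo the
# 3.4M and 5.5M figures quoted from the study.
recs = pd.DataFrame([
    {"from_category": "Partisan Left",   "to_category": "Center/Left MSM", "impressions": 8_100_000},
    {"from_category": "Center/Left MSM", "to_category": "Partisan Left",   "impressions": 4_700_000},
    {"from_category": "Conspiracy",      "to_category": "Partisan Right",  "impressions": 7_000_000},
    {"from_category": "Partisan Right",  "to_category": "Conspiracy",      "impressions": 1_500_000},
])

def net_flow(df: pd.DataFrame, a: str, b: str) -> int:
    """Impressions recommended from category a to category b, minus the reverse direction."""
    a_to_b = df.loc[(df.from_category == a) & (df.to_category == b), "impressions"].sum()
    b_to_a = df.loc[(df.from_category == b) & (df.to_category == a), "impressions"].sum()
    return int(a_to_b - b_to_a)

# A positive result means the algorithm sends more traffic toward the second
# category than it sends back -- i.e. the second category is "advantaged."
print(net_flow(recs, "Partisan Left", "Center/Left MSM"))  # 3,400,000 in this toy data
print(net_flow(recs, "Conspiracy", "Partisan Right"))      # 5,500,000 in this toy data
```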
Basically, YouTube is pushing people towards mainstream media sources. Whether or not you think that's a good thing is up to you. But at the very least, the algorithm doesn't appear to default to extremism, as many people claim. Of course, that doesn't mean it's that way for everyone. Indeed, some people have criticized this study because it only looks at recommendations for non-logged-in users. Nor does it mean that it wasn't like that in the past. This study was done recently, and YouTube has reportedly been adjusting its algorithm quite a bit over the past few years in response to some of these criticisms.
However, this actually highlights some key points. Given enough public outcry, the big social media platforms have taken claims of "promoting extremism" seriously, and have made efforts to deal with it (though, I'll also make a side prediction that some aggrieved conspiracy theorists will try to use this as evidence of "anti-conservative bias," despite it not showing that at all). Companies are still figuring much of this stuff out, and insisting that a handful of anecdotes about radicalization means it must always be so is obviously jumping the gun quite a bit.
In a separate Medium blog post by one of the authors of the paper, Mark Ledwich, it's noted that the "these algorithms are radicalizing everyone" narrative is also grossly insulting to people's ability to think for themselves:
Penn State political scientists Joseph Philips and Kevin Munger describe this as the “Zombie Bite” model of YouTube radicalization, which treats users who watch radical content as “infected,” and that this infection spreads. As they see it, the only reason this theory has any weight is that “it implies an obvious policy solution, one which is flattering to the journalists and academics studying the phenomenon.” Rather than look for faults in the algorithm, Philips and Munger propose a “supply and demand” model of YouTube radicalization. If there is a demand for radical right-wing or left-wing content, the demand will be met with supply, regardless of what the algorithm suggests. YouTube, with its low barrier to entry and reliance on video, provides radical political communities with the perfect platform to meet a pre-existing demand.
Writers in old media frequently misrepresent YouTube’s algorithm and fail to acknowledge that recommendations are only one of many factors determining what people watch and how they wrestle with the new information they consume.
Is it true that some people may have had their views changed over time by watching a bunch of gradually more extreme videos? Sure. How many people did that actually happen to? We have little evidence to show that it's a lot. And now there is some real evidence suggesting that YouTube is less and less likely to push people in that direction, even those who might be susceptible to such a thing in the first place.
For what it's worth, the authors of the study have also created an interesting site, Recfluence.net, where you can explore the recommendation paths of various types of YouTube videos.
Filed Under: algorithms, engagement, radicalization, recommendation algorithm, recommendations
Companies: youtube