The Whole "YouTube Radicalizes People" Story Doesn't Seem To Have Much Evidence To Back It Up
from the myths,-myths-all-around-me dept
There seem to be a lot of "myths" about big internet companies that don't stand up to that much scrutiny, even as they're often accepted as common knowledge. There's the idea that Facebook's algorithm remains in place only because it makes Facebook more money (Facebook's own internal research suggests otherwise), or that disinformation goes viral on social media first (a detailed study showed cable news is a much bigger vector of virality).
Another big one is that YouTube "radicalizes" people via its algorithm. There are lots of stories about how someone went onto YouTube to watch, like, video game clips, and within a week had become an alt-right edge lord troll shouting Trump slogans or whatever. Hell, this was a key plot point in The Social Dilemma, in which the young boy in the fictionalized sitcom family starts watching some videos on his phone, and a week later is participating in an extremist political rally that turns into a riot.
However, a very thorough recent study (first highlighted by Ars Technica) found that there's really not much evidence to support any of this narrative. From the abstract:
Recently, YouTube’s scale has fueled concerns that YouTube users are being radicalized via a combination of biased recommendations and ostensibly apolitical “anti-woke” channels, both of which have been claimed to direct attention to radical political content. Here we test this hypothesis using a representative panel of more than 300,000 Americans and their individual-level browsing behavior, on and off YouTube, from January 2016 through December 2019. Using a labeled set of political news channels, we find that news consumption on YouTube is dominated by mainstream and largely centrist sources. Consumers of far-right content, while more engaged than average, represent a small and stable percentage of news consumers. However, consumption of “anti-woke” content, defined in terms of its opposition to progressive intellectual and political agendas, grew steadily in popularity and is correlated with consumption of far-right content off-platform. We find no evidence that engagement with far-right content is caused by YouTube recommendations systematically, nor do we find clear evidence that anti-woke channels serve as a gateway to the far right. Rather, consumption of political content on YouTube appears to reflect individual preferences that extend across the web as a whole.
Of course, this isn't the first study to find the same thing. A similar study that was released last year came to the same basic conclusion:
In conclusion, our study shows that one cannot proclaim that YouTube’s algorithm, at the current state, is leading users towards more radical content. There is clearly plenty of content on YouTube that one might view as radicalizing or inflammatory. However, the responsibility of that content is with the content creator and the consumers themselves. Shifting the responsibility for radicalization from users and content creators to YouTube is not supported by our data.
A study from two years ago... also found the same thing:
In short, the best quantitative evidence available demonstrates that any “radicalization” that occurs on YouTube happens according to the standard model of persuasion: people adopt new beliefs about the world by combining their prior beliefs with new information (Guess and Coppock, 2018). People select information about topics that interest them; if political, they prefer information that is at least somewhat congenial to their prior beliefs (Stroud, 2017). Persuasion happens at the margins when it does happen.
Indeed, that study showed that the classic story of someone watching a Trump-leaning "alt-lite" video and getting sucked down into alt-right extremism doesn't seem likely to happen that often.
A random walk algorithm beginning at an Alt-Lite video and taking 5 steps randomly selecting one of the ten recommended videos will only be recommended a video from the Alt-Right approximately one out of every 1,700 trips. For a random walker beginning at a “control” video from the mainstream media, the probability is so small that it is difficult to see on the graph, but it is certainly no more common than one out of every 10,000 trips.
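To make that procedure concrete, here's a minimal sketch of the random-walk idea in Python. To be clear, this is not the researchers' code: the `recommend` function is a toy stand-in for YouTube's actual recommendation crawl, and the 0.1% per-video alt-right rate is invented purely so the simulation produces hits at all.

```python
import random

# Toy simulation of the study's random-walk procedure: start at a seed
# video, take 5 steps, at each step look at ten recommendations and hop
# to one chosen uniformly at random. The real study crawled actual
# YouTube recommendations; `recommend` here is a hypothetical stand-in.

def recommend(video_id):
    """Return ten recommended videos (toy stand-in, invented hit rate)."""
    return [
        {"id": f"{video_id}-{i}",
         "label": "alt-right" if random.random() < 0.001 else "other"}
        for i in range(10)
    ]

def walk_hits_alt_right(seed_id, steps=5):
    """True if any recommendation along the walk is labeled alt-right."""
    current = seed_id
    for _ in range(steps):
        recs = recommend(current)
        if any(v["label"] == "alt-right" for v in recs):
            return True
        current = random.choice(recs)["id"]
    return False

trials = 10_000
hits = sum(walk_hits_alt_right("alt-lite-seed") for _ in range(trials))
print(f"Walks that surfaced alt-right content: {hits}/{trials}")
# With the invented 0.1% rate, roughly 1 in 20 walks sees a hit across
# its 50 recommendations; the rate the study actually measured from
# alt-lite seeds was far lower -- about 1 in 1,700 walks.
```

The reason for running thousands of trials is that a single five-step walk tells you almost nothing; the 1-in-1,700 figure only emerges from aggregating many walks.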
And, not that you would necessarily trust research coming directly from YouTube itself, but the company recently released some information on this question as well. Contrary to the "common knowledge," the company hasn't seen "more engagement" on extremist content:
Actually, through surveys and feedback, we’ve found that most viewers do not want to be recommended borderline content, and many find it upsetting and off-putting. In fact, when we demoted salacious or tabloid-type content we saw that watchtime actually increased by 0.5% over the course of 2.5 months, relative to when we didn’t place any limits.
Also, we haven’t seen evidence that borderline content is on average more engaging than other types of content. Consider content from flat earthers. While there are far more videos uploaded that say the Earth is flat than those that say it’s round, on average, flat earth videos get far fewer views. Surveys show that borderline content is satisfying to only a very small portion of viewers on YouTube. We’ve invested significant time and money toward making sure it doesn’t find its way to broader audiences through our recommendations system. Today, borderline content gets most of its views from sources other than non-subscribed recommendations.
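YouTube doesn't say how the demotion works mechanically, but in most recommender systems "demoting" just means down-weighting an item's ranking score before sorting. Here's a minimal sketch of that idea; the multiplier, threshold, and field names are all assumptions for illustration, not anything YouTube has published.

```python
# Hypothetical re-ranking step: down-weight videos a classifier has
# flagged as borderline, then sort by the adjusted score. The 0.3
# multiplier, 0.8 cutoff, and field names are invented for illustration.

DEMOTION_FACTOR = 0.3       # assumed penalty on flagged items
BORDERLINE_CUTOFF = 0.8     # assumed classifier threshold

def rerank(candidates):
    """Sort candidates by engagement score, demoting flagged ones."""
    def adjusted_score(video):
        score = video["engagement_score"]
        if video["borderline_score"] > BORDERLINE_CUTOFF:
            score *= DEMOTION_FACTOR
        return score
    return sorted(candidates, key=adjusted_score, reverse=True)

candidates = [
    {"id": "tabloid", "engagement_score": 0.9, "borderline_score": 0.95},
    {"id": "news", "engagement_score": 0.7, "borderline_score": 0.10},
    {"id": "howto", "engagement_score": 0.5, "borderline_score": 0.20},
]
print([v["id"] for v in rerank(candidates)])  # ['news', 'howto', 'tabloid']
```

Note that under a scheme like this the flagged video still exists and can still be reached directly; it just stops being surfaced ahead of everything else, which is consistent with YouTube's claim that overall watchtime went up rather than down.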
Now, I will note that YouTube recently changed that final line. I noticed last month when Evelyn Douek tweeted about it -- and the final sentence she highlighted said something different at the time:
From YouTube's blog yesterday: "borderline content gets most of its views from other platforms that link to YT"
Yup! That is why a) we need to consider the internet as an entire ecosystem not platform-by-platform & b) YT shd release more info abt this https://t.co/ypsT5CCd0K pic.twitter.com/TV0WGZssu9
— evelyn douek (@evelyndouek) September 16, 2021
In the version she tweeted, the final line reads: "Today, borderline content gets most of its views from other platforms that link to YouTube." But, sometime after she tweeted that, YouTube changed it to read: "Today, borderline content gets most of its views from sources other than non-subscribed recommendations." That's... a little different. At the very least, it makes me wonder why the change was made -- and it highlights that there may be a difference between "subscribed recommendations" and "non-subscribed recommendations."
Still, there are questions about how people are finding YouTube videos -- and they do come from all over. As Douek highlights in a follow-up tweet, the most-viewed external links on Facebook point to YouTube. This doesn't absolve YouTube and its algorithm, but once again highlights how (1) all of this is a lot more complicated than people make it out to be, (2) the internet is an interconnected ecosystem, not just a few giant sites, and (3) the common wisdom you may have heard... might not be supported by the data.
Filed Under: algorithms, myth debunking, outside links, radicalization
Companies: youtube
Reader Comments
But it is so much simpler to blame YT & demand they fix all of society's problems for us.
Humans have this problem where "everybody knows" things.
All sex offenders raped kids - Not even close.
YT radicalizes kids!! - Umm maybe turn off faux news for 5 min.
Talking about sex makes kids have sex - Uhhhh what?
The list goes on and on & it's impossible to actually deal with these stupid ideas because some people's paychecks depend on keeping the lie alive.
Trying to point out that not all "sex offenders" are child rapists just gets you painted as someone willing to sacrifice children to the monsters who ...(checks notes)... got drunk & pissed in an alley.
Assigning labels to things isn't the problem; the problem is humans demand that more and more things be stuffed into the label until it's this massive thing that can't be stopped.
It's easier to just pretend that the label, that humans made stupidly huge, is too hard to handle or discuss.
We insist our viewpoint is the only right one even as reality disproves it. (Something something, IIRC Colorado made birth control super easy for teens to get and the teen birth rate went way down.) In a nation that believes its children can be addicted to video games, people also think that if we tell kids it's icky, no kids will have sex... sorta stupid, this humanity thing.
Sometimes we have to accept that what we WANT to believe isn't true, and more harm is caused by holding onto the beliefs while ignoring reality. (Something something Catholics STILL claiming those kids weren't diddled by priests & fighting tooth and nail to block any investigation because they KNOW it didn't happen despite medical evidence.)
People Get What They Seek
One of the best features of Google search, other search engines, YouTube, Facebook, etc. is that people will find whatever it is they are seeking. That applies whether it's true or false; benevolent, benign, or beastly.
So, in these random walk experiments, what we're really testing is whether YouTube will take a neutral viewer and feed them radical content, and the answer seems to be no. That's good, but...
I'm more interested in an experiment where you take a YouTube viewer who starts with a slightly "disinformation" video request or inbound link, then where does the random walk take them. Because I think THAT's what's happening.
And this is fully in agreement with your point, Mike, that they see some disinfo elsewhere or hear about it on Fox News, then link-in or seek it on YouTube. But THEN what happens?
Anecdotally, my dumb friend was "just asking questions" about a flat earth about 7 years ago. One year later, he was sure of it, and had started into a bunch of other conspiracies. YouTube may not start people down the wrong path, but it does seem to provide them the "rabbit hole", should they start that way.
Re: People Get What They Seek
Yes, the studies don't seem to be testing the correct scenarios. The other scenario is where the platform allows an alt-right group to target people with ads based on some indicator of gullibility; the people engage with that content, which causes it to be elevated for others in organic suggestions/search results. It only takes a few thousand people in a swing state to change the outcome of a statewide election. The radicalization doesn't need to be widespread to have a serious effect.
Re: People Get What They Seek
I'm more interested in an experiment where you take a YouTube viewer who starts with a slightly "disinformation" video request or inbound link, then where does the random walk take them.
"A random walk algorithm beginning at an Alt-Lite video and taking 5 steps randomly selecting one of the ten recommended videos will only be recommended a video from the Alt-Right approximately one out every 1,700 trips."
Re: People Get What They Seek
I'm more interested in an experiment where you take a YouTube viewer who starts with a slightly "disinformation" video request or inbound link, then where does the random walk take them. Because I think THAT's what's happening.
Read again. That's exactly what the study did.
Re: People Get What They Seek
You got my 'Insightful' vote just for the comment title alone, because that line was on repeat in my head the whole time I was reading the article.
Re: People Get What They Seek
It's almost like people will look for videos that confirm their pre-existing biases.
:O
Re: People Get What They Seek
"YouTube may not start people down the wrong path, but it does seem to provide them the "rabbit hole", should they start that way."
That... is arguably the most heavily pushed misunderstanding about literally every social phenomenon that gets blamed for societal ills. From The Sorrows of Young Werther to D&D, fantasy, science fiction, YouTube, TikTok, all of the darn intarwebz and every invention to take place since the first time humankind managed to coax sparks out of flint.
That anecdotal friend of yours? He may or may not have been in a bad way from the get-go, but if he then spent 7 years primarily taking his information from conspiracy sites, he belongs in the same padded cell as the guy who started believing a D&D splatbook could teach him to do real magic.
Because face it, your anecdotal friend is dumb. Gullible. Willing to subscribe to a religion of fucknuttery relying on more interesting things than dry old factual reality.
Please
A definition of RADICAL.
Because that would mean every great person would be a radical.
Thinking outside the box? Thinking outside the Norm?
Everyone that doesn't see things the WAY you do?
Giving corps and advertisers all the control is the norm?
EXCUSEEEEE MEEEE?
As you wish!
A radical is an atom, molecule, or ion that has an unpaired valence electron.
Remember, you DID ask.
Re: As you wish!
Also, they're generally considered harmful unless bound to a predefined transport chain shuttling them to where they're needed.
This should mean we can cure Trump cultists, religious fanatics and the church of QAnon with a sufficient dosage of antioxidants and a vegetable-heavy low-carb diet.
Surely there is some way - perhaps serializing? - to avoid murdering words to ft twtr char lmts....
There are those who would call someone an "alt-right edge lord troll shouting Trump slogans" just for having a "wrong" opinion about a videogame.
Re:
For example?
Re: Re:
That is always the question. I've never seen anyone be called "an alt-right edge lord troll shouting Trump slogans" for having a constructive opinion about a videogame. There's been some heated debate about the content of something like, say, 12 Minutes recently, but I see very few outright abusive moments going from the left to the right where it's not justified. I have, however, seen plenty of right wingers lose their shit because there's a game where a character is female, gay, lesbian, black, Asian, or any combination thereof.
So, what's the "wrong" opinion here? If you're uncomfortable with the depictions of indigenous populations, with the addressing of mental health issues, or with complex ideas about a political future, then there might be a conversation to be had. If the "wrong" opinion is that some fictional mythical figures are not depicted as being white, then you probably earned your accusations...
Two responses... First, if it's not true, how do you explain my uncle?
Second, even Facebook's research says a person becomes more radicalized.
Re:
1 - Your uncle is the brother of your father or mother.
2 - Why would Facebook do research on Youtube?
Re:
"Even Facebook research says a person becomes more radicalized."
From what?
Yeah, Facebook, Youtube, all the social networks - they allow you to find a silo of like-minded morons where the latest broken logic on how Antifa mind-controlled staunch patriots into shitting on the rotunda floor on jan 6th keeps bouncing around until they start believing it for real...
...but that holds true for every social gathering where people invite like-minded.
Social platforms online are a mediator which works to make connecting people easier in all ways. As such they don't radicalize at all. They just facilitate the mentally unfortunate in the same way they benefit the MENSA society debating ways to solve the global warming issue or terraform Venus.
The data emerging isn't a condemnation of social media. It's an indication that most countries, and the US especially, need to up their game when it comes to diagnosing and treating the mentally ill, feeding the hungry, and teaching the ignorant.
Furthermore... QAnon is proof that Facebook and YouTube radicalize people.
People weren't seeking out QAnon material. They weren't radicalized Q supporters prior to YouTube and Facebook. Few if any learned about QAnon from a friend or a bus stop. They weren't even instant converts after watching one random video. It was through repeated exposure that they became believers in the imaginary radical QAnon stories and videos.
Re:
There's a number of parties involved. Maybe YouTube has some culpability if the person is exposed to a video they'd never have found on their own. But if you watch a few videos claiming that some unknown person, whose predictions never actually come true, is telling the real truth -- that Trump was sent by god to resurrect JFK Jr in order to destroy paedophile rings being run out of imaginary pizza basements, and to destroy democracy in order to install your preferred dictatorship -- and you believe it, that's 100% on you.
Re: Re:
"...but if watching a few videos claiming that some unknown person is predicting things that never actually happen..."
We'll always have that urge to come up with a single, understandable, specific and easily graspable reason as to why a thinking human being identical in form and capacity to ourselves would suddenly wrap tinfoil around their head and start screaming about how the liberal cannibal cult helmed by the Kenyan Muslim on behalf of the NWO is turning all teh frogs gay because chemtrails and fnord.
I mean, psychologically it's a lot easier to blame youtube, that D&D splatbook, fantasy or the ready access to imagery of nudity than it is to accept that it's all just an innate defect of humanity which is warded off only by a solid focus on reason and logic during our formative years.
Re: Re: Re:
Yeah, but you have to be predisposed to believe such things. What's interesting is that apparently some people can watch a half-assed rant from someone sitting in their SUV about things that can be debunked in moments and be drawn into believing everything they say without question, while others can roll their eyes and click "next" before the opening rant is even finished.
"Kenyan Muslim"
The birther stuff was especially interesting to me. Obama was required to provide more evidence of his eligibility than any other president in history, eventually volunteered even more than that, and has never had anything other than his name suggest that he's not Christian. For the birther claims to be true, you'd have to believe that people in the 1960s were conspiring to guarantee a black president. Yet some people (including a certain orange man) believed them without question, and claim that the reason is not racism.
"the NWO"
Similarly, most of those conspiracies are essentially variations on the Protocols Of The Elders Of Zion, which is literally Russian anti-semitic propaganda. Post-Holocaust, some people have understood you can't just come out and say that directly, but it's the same thing.
Re: Re: Re: Re:
"Yeah, but you have to be predisposed to believe such things."
Two things factor into this. The first is religious upbringing. If you've been taught, through your formative years, that everything you can observe or calculate is a lie and that everything you need to pay credence to is backed only by a Big Holy Book... that's your predisposition, right there. Given that the young turd-to-be raised in a sectarian religious home has also likely been taught to take it on similar faith that "liberals BAD," it's not exactly a logical stretch to understand why they will blindly consume any assertion made against the focus of their hatred.
The second part is grievance addiction. The addict will naturally be predisposed to accepting, blindly, any assertion which serves to provide them their daily dose of hatred. And anyone today in the cult of Trump, among the followers of QAnon, or among the generic republican Karens obsessed with entitlement has been raised to hate whoever the current bogeyman happens to be.
"Obama has been required to provide more evidence of his eligibility than any other president in history..."
...and 90 million Americans found great solace in spewing bile and venom at him because he was the focus of so very many of their primary suppliers of grievance. A black man. Intellectual. Liberal.
"Similarly, most of those conspiracies are essentially variations on the Protocols Of The Elders Of Zion, which is literally Russian anti-semitic propaganda."
The Jews have caught most of the flak through history mainly because they made such convenient "others." A people with close-knit communities of quaint and obviously different religious and social practices, often highly educated, who also have the bad taste to often become eminently successful and prosperous? A prime target for envy and a convenient scapegoat to blame for all ills.
But they are hardly the first. The Romans went after the Christians for the same reasons. The Carthaginians went after the Romans. Etc., ad nauseam.
This philosophy of scapegoating is usually the core part of the viral social meme of millennialist belief. Who the scapegoat is is as interchangeable as a smartphone casing. What matters is that it taps into making people addicted to hatred from a young age, raising humans through their formative years on a steady dosage of hate-mediated adrenaline.
The predisposition isn't about some sort of preunderstanding or preconceived notion - it's about being raised a junkie hooked on hating someone.
And that's why someone like Alex Jones or Ann Coulter just has to holler trigger words to get the audience instantly feeling that rush of dopamine and adrenaline as their anger spikes. Saner people just listen and go "What the actual fuck?" with raised eyebrows. The addicts, though, will defend their theory of "teh gay frogs" and "Cannibal Killary" to their deaths since that's what they get their current shot of natural opiates from.
Re: Re: Re: Re: Re:
I would look for some research to back up that claim if I were you. I am me and I looked and found this:
"The results showed that believers and non-believers did not differ in the belief in conspiracy theories."
https://bpspsychub.onlinelibrary.wiley.com/doi/abs/10.1111/bjso.12314
Re:
No, it isn't. QAnon is proof that some people are gullible morons who'll intentionally throw themselves down the rabbit hole if exposed to something that either appeals, in some way, to their beliefs, or confirms them outright (i.e., confirmation bias).
There's no evidence that Youtube or Facebook can "turn" a middle-of-the-road normie through their algorithms.
Re:
More like some members of the republican party are working overtime to radicalise people, and Facebook and YouTube are convenient deflections.
Re:
"People weren't seeking out QAnon material. They weren't radicalized Q supporters prior to YouTube and Facebook."
Cast your mind back to the days of George W Bush & Dick Cheney, and tell me that the base of radical "hate the lib" fanatics ordering their "freedom fries" while defending Cheney's advocacy of torture and the lie about Iraq having WMDs wasn't as loud and strident then as it is today. Trump pulled out all the plugs and opened the floodgates, sure... but everything he released was already there and had been building up for some time.
Sure, these people were looking for Q. They'd been trained and taught for their entire life that the prime cause of all their ills was the gahd-damned damyankee libs.
Qanon is just the latest prophet in a long line, providing a steady parade of reasons to keep hating liberals. A worthy successor of Rush Limbaugh.
You can't bring a sensible person to believe in Q. Ever. No matter how many times that message repeats. But the damaged? The deranged? Some 70-90 million people in the US republican base who have spent their entire lives addicted to grievance? Oh, they'll believe it the same way ANY junkie believes in their next fix. Unconditionally and without questioning.
This is the point so many keep missing. Yeah, very little of what the republican base believes in makes any sense at all. And that's because that belief isn't based in logic. It's based in the fact that persistently elevated levels of adrenaline are more addictive than most opioids on the market.
This scapegoat strategy -- where you point out a target to be hated and then feed the addiction of your voter base until they're so hooked on it they're willing to believe anything which keeps feeding that hatred -- has been used quite a lot throughout history. Hitler's "history" of events, with his "Big Lie" and "Fake News," was so stunningly successful that republican leaders just had to copy his methods of convincing the masses, and they've thrived since the '80s without ever having any actual platform other than "No."
I don't know what the researchers defined as "anti-woke," but I follow multiple channels that regularly speak out against woke BS and it's mostly lamenting Hollywood turning into a boring propaganda factory and calling out bigotry masquerading as diversity. And of course opposing crazy Twitter hate wars and cancel culture, as the core of wokeness.
They're not all American. None of them are overly political. For many, the point is that everything is already too politicized as it is, something I strongly agree with, especially in entertainment. They're not all right-wing, and none of them are alt-anything. In fact, some of them are explicitly left-wing -- liberal left, as opposed to woke. Even Bill Maher complains about wokedom often enough; totally love the guy.
To me, "anti-woke" means common people promoting practical common sense over political division, hypocrisy, propaganda and extremism. Against "radicalizing" rather than for it.
Add YouTube to the list:
Red Scare
"Devil's Music"
Switchblades
Dungeons and Dragons
Rock Music (Again)
Dangerous Dogs
Violence in Video Games
Common wisdom suggests that the algorithm only shows you what you want to see more of. Regardless of the platform.
I suggest one actually try this out. Find a cute hamster video (or pet of choice) and see where it leads. The choice of platform is irrelevant.
Guess what? You're gonna get more hamster (or cute pets) related content. Usually.
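In code terms, that feedback loop is about as simple as it sounds: watching a topic bumps that topic's weight, and the next recommendation is sampled in proportion to the weights. A toy sketch follows (every number here is invented; no platform's real algorithm looks this simple):

```python
import random

# Toy preference-feedback loop: each watch bumps that topic's weight,
# and recommendations are sampled in proportion to the weights. All
# values are invented for illustration.

weights = {"hamsters": 1.0, "news": 1.0, "gaming": 1.0}

def recommend_topic():
    topics, w = zip(*weights.items())
    return random.choices(topics, weights=w, k=1)[0]

def watch(topic, boost=0.5):
    weights[topic] += boost  # engagement reinforces the preference

watch("hamsters")             # start with one cute hamster video...
for _ in range(20):
    watch(recommend_topic())  # ...then just follow the recommendations

print(weights)  # "hamsters" usually ends up with the biggest weight
```

Run it a few times and you'll see where the "usually" comes from: the loop amplifies whatever got the early engagement, but it doesn't always lock in.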
As for radicalization -- man, I sure as hell don't think it could be because of how powerless we feel at a time when we know we're losing our freedoms to corps, only to die in their pointless culture wars or some shit?