New Study Suggests That YouTube's Recommendation Algorithm Isn't The Tool Of Radicalization Many People Believe (At Least Not Any More)

from the well,-look-at-that dept

It's become almost "common knowledge" that various social media recommendation engines "lead to radicalization." Just recently, while giving a talk to telecom execs, I was told, point blank, that social media was clearly evil and clearly driving people into radicalization because "that's how you sell more ads," and that nothing I could say could convince them otherwise. Thankfully, though, there's a new study that throws some cold water on those claims by showing that YouTube's algorithm -- at least as of late 2019 -- appears to be doing the opposite.

To the contrary, these data suggest that YouTube's recommendation algorithm actively discourages viewers from visiting radicalizing or extremist content. Instead, the algorithm is shown to favor mainstream media and cable news content over independent YouTube channels....

Indeed, as you read through the report, it suggests that if YouTube's algorithm has any bias at all, it's one towards bland centrism.

The recommendations algorithm advantages several groups to a significant extent. For example, we can see that when one watches a video that belongs to the Partisan Left category, the algorithm will present an estimated 3.4M impressions to the Center/Left MSM category more than it does the other way. On the contrary, we can see that the channels that suffer the most substantial disadvantages are again channels that fall outside mainstream media. Both right-wing and left-wing YouTuber channels are disadvantaged, with White Identitarian and Conspiracy channels being the least advantaged by the algorithm. For viewers of conspiracy channel videos, there are 5.5 million more recommendations to Partisan Right videos than vice versa.

We should also note that right-wing videos are not the only disadvantaged groups. Channels discussing topics such as social justice or socialist views are disadvantaged by the recommendations algorithm as well. The common feature of disadvantaged channels is that their content creators are seldom broadcasting networks or mainstream journals. These channels are independent content creators.
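
To make the "net advantage" figures in the quote concrete: the metric compares, for each pair of channel categories, the recommendation impressions flowing in one direction against the flow in the other. Below is a minimal sketch in Python (not the authors' actual code or data) of how such a comparison can be computed. Every impression count in it is an illustrative placeholder; only the two net figures (3.4 million and 5.5 million) come from the passage above.

```python
# Minimal sketch of the study's "net advantage" comparison between channel
# categories. The raw impression counts below are illustrative placeholders;
# only the *net* flows (3.4M and 5.5M) are taken from the quoted passage.

# estimated_impressions[source][destination] = impressions shown to viewers of
# source-category videos that recommend destination-category videos.
estimated_impressions = {
    "Partisan Left":   {"Center/Left MSM": 10_000_000},
    "Center/Left MSM": {"Partisan Left": 6_600_000},   # placeholder; net = 3.4M
    "Conspiracy":      {"Partisan Right": 8_000_000},
    "Partisan Right":  {"Conspiracy": 2_500_000},      # placeholder; net = 5.5M
}

def net_flow(src: str, dst: str) -> int:
    """Impressions recommended from src-category viewers toward dst-category
    videos, minus the reverse direction. Positive means dst is advantaged."""
    forward = estimated_impressions.get(src, {}).get(dst, 0)
    reverse = estimated_impressions.get(dst, {}).get(src, 0)
    return forward - reverse

print(net_flow("Partisan Left", "Center/Left MSM"))  # 3400000
print(net_flow("Conspiracy", "Partisan Right"))      # 5500000
```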

Basically, YouTube is pushing people towards mainstream media sources. Whether or not you think that's a good thing is up to you. But at the very least, the algorithm doesn't appear to default to extremism, as so many people insist. Of course, that doesn't mean it works that way for everyone. Indeed, some people have criticized this study because it only looks at recommendations shown to non-logged-in users. Nor does it mean that it wasn't like that in the past. This study was done recently, and YouTube has reportedly been adjusting its algorithms quite a bit over the past few years in response to some of these criticisms.

However, this actually highlights some key points. Given enough public outcry, the big social media platforms have taken claims of "promoting extremism" seriously and have made efforts to deal with it (though I'll also make a side prediction that some aggrieved conspiracy theorists will try to use this as evidence of "anti-conservative bias," despite it not showing that at all). Companies are still figuring much of this stuff out, and insisting that a handful of anecdotes about radicalization means it must always be so is obviously jumping the gun quite a bit.

In a separate Medium blog post, one of the authors of the paper, Mark Ledwich, notes that the "these algorithms are radicalizing everyone" narrative is also grossly insulting to people's ability to think for themselves:

Penn State political scientists Joseph Philips and Kevin Munger describe this as the “Zombie Bite” model of YouTube radicalization, which treats users who watch radical content as “infected,” and that this infection spreads. As they see it, the only reason this theory has any weight is that “it implies an obvious policy solution, one which is flattering to the journalists and academics studying the phenomenon.” Rather than look for faults in the algorithm, Philips and Munger propose a “supply and demand” model of YouTube radicalization. If there is a demand for radical right-wing or left-wing content, the demand will be met with supply, regardless of what the algorithm suggests. YouTube, with its low barrier to entry and reliance on video, provides radical political communities with the perfect platform to meet a pre-existing demand.

Writers in old media frequently misrepresent YouTube’s algorithm and fail to acknowledge that recommendations are only one of many factors determining what people watch and how they wrestle with the new information they consume.

Is it true that some people may have had their views changed over time by watching a bunch of gradually more extreme videos? Sure. But how many people did that actually happen to? We have little evidence to show that it's a lot. And now there is some real evidence suggesting that YouTube is less and less likely to push people in that direction, even among those who might be susceptible to such a thing in the first place.

For what it's worth, the authors of the study have also created an interesting site, Recfluence.net, where you can explore the recommendation paths of various types of YouTube videos.


Filed Under: algorithms, engagement, radicalization, recommendation algorithm, recommendations
Companies: youtube


Reader Comments



  • aerinai (profile), 6 Jan 2020 @ 10:56am

    If Thanksgiving Dinner has tought us anything...

    Your crazy uncle was crazy before YouTube.


    • Anonymous Coward, 6 Jan 2020 @ 12:06pm

      Re: If Thanksgiving Dinner has tought us anything...

      Yeah - the problem with social media isn't technical but the people using it. But as always it is easier to blame the scary new thing instead of considering that the problem is humans and a long preexisting condition.


      • Ben L (profile), 6 Jan 2020 @ 4:38pm

        Re: Re: If Thanksgiving Dinner has tought us anything...

        I try explaining this to some of my friends, and the response I got was that I was letting big companies 'off the hook'. So the question is where does human behavior stop and corporate responsibility start? Youtube is an interesting test case since we have legacy media outlets arguing more responsibility is not promoting potentially dangerous voices, while individual creators argue this means better treatment and more creative freedom. Both these things are desirable in theory, but they also can conflict.


        • Anonymous Coward, 7 Jan 2020 @ 7:15am

          Re: Re: Re: If Thanksgiving Dinner has tought us anything...

          "corporate responsibility"

          I did not see any sarcasm tag but I still assume it is.

          Gotta love those corporate mission statements proclaiming their community spirit.


    • Bobvious, 6 Jan 2020 @ 2:17pm

      Re: If Thanksgiving Dinner has tought us anything...

      Is your crazy uncle married to Aunty Social Media?


  • Stephen T. Stone (profile), 6 Jan 2020 @ 12:12pm

    Forest, trees, etc.

    The issue shouldn’t really be with YouTube’s algorithms pointing people to extremist/radicalizing content. The issue should be with YouTube hosting such content in the first place.


    • Anonymous Coward, 6 Jan 2020 @ 12:29pm

      Re: Forest, trees, etc.

      Detecting such content by algorithm is difficult, especially in the mix of fact, satire and fiction that is YouTube. It looks like the Algorithm is being biased towards publishing companies, and away from individuals, and that is a bit of a problem in its own right.


    • That One Guy (profile), 6 Jan 2020 @ 12:56pm

      Be careful what you wish for

      The issue shouldn’t really be with YouTube’s algorithms pointing people to extremist/radicalizing content.

      Absolutely, YouTube really should know better than to host pro-LGBT content, content that presents such subjects as anything related to sex, racial and/or gender equality or the wrong religion(or even worse, no religion) as acceptable, content that encourages the young to question their elders, and don't even get me started on copyright infringing works...

      Be very careful when talking about how platforms shouldn't be allowing 'extremist/radicalizing content', because I guarantee that somewhere out there are people and entire groups that would consider any or all of the above to fall into those categories, and would jump at the wedge that idea would provide to push with all their might.


      • Anonymous Coward, 6 Jan 2020 @ 1:10pm

        Re: Be careful what you wish for

        Don't feed the trolls. Just flag his authoritarian, censorious propaganda and leave it at that.


        • Anonymous Coward, 7 Jan 2020 @ 7:18am

          Re: Re: Be careful what you wish for

          Don't even try to engage in the discussion, just throw around a few accusations.


      • Stephen T. Stone (profile), 6 Jan 2020 @ 1:27pm

        Damn. That is an excellent point. 👍


        • Scary Devil Monastery (profile), 8 Jan 2020 @ 2:34am

          Re:

          "Damn. That is an excellent point. "

          It is indeed. It's the old question as to what constitutes "offensive" culture all over again. Once we start trying to decide what sort of information is objectionable we always end up with the same tired old question of WHO we'd trust to make that call. Knowing full well just how many people in impeccable authority we are aware of who would cheerfully leap at the chance to put <insert ungodly and heathen ways here> back in the bottle and out of the public view for good.

          I was surprised to see you falling in a logic hole like that, Stephen.


          • Stephen T. Stone (profile), 9 Jan 2020 @ 7:32am

            Yeah, I fucked up, but at least I can learn from the mistake. 👍


            • Scary Devil Monastery (profile), 10 Jan 2020 @ 7:29am

              Re:

              "Yeah, I fucked up, but at least I can learn from the mistake."

              I think it's the sort of mistake everyone makes before they get properly schooled by calmer heads.

              It's all too easy to watch some bigoted misogynistic douchebag sociopath pour the fruits of his feverish imagination onto a forum thread and start going "Oh, if someone would just keep people like him from speaking...".


  • Anonymous Coward, 6 Jan 2020 @ 12:21pm

    right wing conspiracy

    Pushing people towards centrist views is radicalising them against us.


    • Big Daddy War Fucks, 7 Jan 2020 @ 7:24am

      Re: right wing conspiracy

      The steady drift to the right has been going on for decades. The 60s saw a slight improvement in civil rights while the 70s saw an end to our involvement in nam - but it was right back to it after that. Gotta start a few more wars to boost those dividends.

      Look at where it has brought us.


  • Anonymous Anonymous Coward (profile), 6 Jan 2020 @ 12:29pm

    How not to do Monday morning quarterbacking

    "Just recently in giving a talk to telecom execs, I was told, point blank, that social media was clearly evil and clearly driving people into radicalization because "that's how you sell more ads" and that nothing I could say could convince them otherwise. "

    If those telecom execs knew what the execs at Google know, then they would be Google. Look at how well they are doing with the extra-curricular businesses, i.e. entertainment, streaming, etc. Seeing what is going on and being able to do it are very different things.

    And, as the AC said above, it isn't about the algorithm, it is about human nature, something else those telecom execs don't understand.


  • That One Guy (profile), 6 Jan 2020 @ 12:39pm

    Mind like a locked box

    Just recently in giving a talk to telecom execs, I was told, point blank, that social media was clearly evil and clearly driving people into radicalization because "that's how you sell more ads" and that nothing I could say could convince them otherwise.

    Well, I suppose it was nice of them to admit up front that it would be a waste of time talking to them on the subject.

    The question at that point I suppose would be what had them so convinced? What data were they looking at, was it simply the usual 'social media bad' talking point, or, rather less generous, if it was a matter of projection, a case of 'this is how we get attention so obviously they're doing it too'.


  • Anonymous Coward, 6 Jan 2020 @ 1:09pm

    Just recently in giving a talk to telecom execs, I was told, point blank, that social media was clearly evil and clearly driving people into radicalization because "that's how you sell more ads"

    So you sell more ads by pushing content that appeals to less people? And this guy calls himself a businessman?

    What a maroon!


    • Anonymous Coward, 6 Jan 2020 @ 1:40pm

      Re:

      As if ads weren't more mainstream than... mainstream. One would need radical ads to appeal to radical audiences, and while there are some, there are simply not that many.

      People have been spouting bs about clickbait, radicalization, and ad revenue for so long, most talk about these things has zero real meaning and is seriously divorced from reality. They may be pushing a story, but they aren't using the concepts correctly whatsoever. (Sometimes purposely, for sure.)


  • bhull242 (profile), 6 Jan 2020 @ 3:11pm

    Just recently in giving a talk to telecom execs, I was told, point blank, that social media was clearly evil and clearly driving people into radicalization because "that's how you sell more ads" and that nothing I could say could convince them otherwise.

    Well, people can be idiots. (Yes, I’m aware of the massive understatement there.)

    Also, this actually makes me question telecom execs’ ad policies.


  • Steve Marz, 6 Jan 2020 @ 3:17pm (flagged by the community)

    Techdirt lies again

    I am logged it to YouTube all the time. I am subscribed to channels like styxxhexanhammer666, adam green know more news, black pigeon speaks. Yet I never ever get informed from YouTube thanks to there algorithms about my videos. You are such liars techdirt. All you favor is allowing your readers to do illegal things on the internet like copyright infringement and help the sex slave trafficking industry by lying about SESTA. The CASE ACT is coming soon and I am calling on my senators to pass it. Being a racist is not illegal but copyright infringement and sex trafficking is. Keep that in mind techdirt when your lying reporters are thrown in jail by are POTUS Donald J Trump


    • Stephen T. Stone (profile), 6 Jan 2020 @ 3:25pm

      Being a racist is not illegal but copyright infringement and sex trafficking is.

      Bold of you to low-key admit you’re a racist, but go off I guess.


      • Henry Goldberg, 6 Jan 2020 @ 9:56pm (flagged by the community)

        Re: techdirt support illegal behavior that is against the law

        Fact is fact. Being a racist is not illegal. Copying music and movies is and that is what techdirt supports. LAWLESSNESS


        • Anonymous Coward, 6 Jan 2020 @ 11:48pm

          Re: COPYRIGHTS BEST AND BRIGHTEST ON FULL DISPLAY HERE

          WOW bro!

          Just WOW!

          Please continue...


        • Stephen T. Stone (profile), 7 Jan 2020 @ 3:45am

          Not that you care, but copying isn’t illegal in and of itself. Context determines illegality. Copying a song you made, in any context, isn’t illegal. Copying a song someone else made isn’t illegal if, say, you’re copying that file to a personal, private backup. Copying a song someone else made by downloading or uploading a copy without permission? Yeah, that’s technically illegal, but it’s more a civil matter than a criminal one.

          And yeah, being a racist isn’t illegal — but neither is punching yourself in the face, which is just as much a form of self-harm as is hating an entire group of people based on skin color, stereotypes, and someone else’s ideas of racial superiority.


          • That One Guy (profile), 7 Jan 2020 @ 8:15am

            Re:

            Personally I find it rather entertaining that they are trying to conflate 'legal' with 'moral' in order to portray being racists as 'better' simply because it's legal. Plenty of abhorrent things throughout history have been legal, but that didn't mean those that did them were in any way moral or any less reprehensible from that legality.


        • Scary Devil Monastery (profile), 8 Jan 2020 @ 2:44am

          Re: Re: techdirt support illegal behavior that is against the law

          "Being a racist is not illegal. Copying music and movies is and that is what techdirt supports."

          Unmistakably Bobmail/Jhon/Blue. No one else would think they were arguing against copying media by raising actual racism as an argumental lever. Because law.

          The copyright cult, ladies and gents. For when you miss the good old 15th century inquisition.


      • Scary Devil Monastery (profile), 8 Jan 2020 @ 2:41am

        Re:

        "Bold of you to low-key admit you’re a racist, but go off I guess."

        ...and somehow trying to use that low-key admission as a argumental lever to wedge "copyright infringement" into the discussion and link it to trafficking. Shall we play "Spot Blue/Jhon/Bobmail"? He's the only one who could twist so many generally offensive implications into one sentence and still believe he was arguing in favor of something.


    • Anonymous Coward, 6 Jan 2020 @ 5:08pm

      Re:

      How's that Fox Rothschild defense fund coming along, John Smith?


    • Anonymous Coward, 6 Jan 2020 @ 7:19pm

      Re: blue balls rides again

      “are POTUS”

      Magnifico! 💋👌🏽


    • bhull242 (profile), 9 Jan 2020 @ 11:21pm

      Re: Techdirt lies again

      I am astounded. Do I need to teach you some basic logic? YouTube’s faulty algorithms have no connection to FOSTA or CASE.


  • Norahc (profile), 6 Jan 2020 @ 3:31pm

    Keep that in mind techdirt when your lying reporters are thrown in jail by are POTUS Donald J Trump

    I'm not even going to try and explain to you your massive misunderstanding of how algorithms work because quite frankly, a brick wall has a better chance of understanding it.

    But your statement about POTUS arresting people is beyond laughably stupid. Despite your fervent fantasy, our President does not have the power to arrest anyone, or even to order their arrest because he wants to.


    • Stephen T. Stone (profile), 6 Jan 2020 @ 5:01pm

      Try telling that to Donald “Article II means I can do whatever I want” Trump.


      • Norahc (profile), 6 Jan 2020 @ 5:24pm

        Re:

        Just because the trolls share the same delusions as the President doesn't make it reality.


        • Anonymous Coward, 7 Jan 2020 @ 7:28am

          Re: Re:

          I look forward to the donny perp walk.


        • Scary Devil Monastery (profile), 8 Jan 2020 @ 2:48am

          Re: Re:

          "Just because the trolls share the same delusions as the President doesn't make it reality."

          Eh, Bobmail/Jhon/Blue has staunchly held to that delusion ever since his days on Torrentfreak. Remind me to dig up some of his old gems on how ip tracing pirates should be held the same as the secret service tracing terrorist death threats against the POTUS. Because law.


    • Anonymous Coward, 7 Jan 2020 @ 1:46am

      Re:

      But your statement about POTUS arresting people is beyond laughably stupid.

      But his ability to order someone on foreign soil killed is extremely scary, and maybe makes some people think he can order anybody arrested.


      • Anonymous Coward, 7 Jan 2020 @ 7:30am

        Re: Re:

        There is a huge difference between what is allowed by law and what actually happens. Excuses are plenty, but nothing changes - it's all a bunch of bullshit.


  • NoahVail (profile), 6 Jan 2020 @ 8:07pm

    Anti-Radicalization

    What's it like to be radicalized by endless, pointless, loud, eye-ball hooking videos that aren't ever something I'd want to watch?


  • Anonymous Coward, 7 Jan 2020 @ 2:32am

    I can confirm it

    A while ago, there were elections in the EU and Youtuber Rezo produced a video "destruction of the CDU" (biggest German party). It was quite popular with many million views but when I tried to find it on YT, it was quite hard. I literally typed the title, but all I got were criticisms, mostly from mainstream media.


  • R/O/G/S, 7 Jan 2020 @ 8:13am

    Like I said...

    note that right-wing videos are not the only disadvantaged groups. Channels discussing topics such as social justice or socialist view are disadvantaged by the recommendations algorithm as well.


  • Ninja (profile), 7 Jan 2020 @ 9:41am

    There is evidence that social networks (including youtube) have done quite the damage in many aspects around the world, specially regarding elections. However, it seems the owners (Google etc) are taking steps to curb these 'vulnerabilities'.

    That said, the lack of proper education focusing on critical thinking probably played a much more important role. We need to rethink our education systems to produce thinking citizens, not adults that know how to take tests.


  • bobob, 7 Jan 2020 @ 1:57pm

    I wasn't aware that youtube's recommendation algorithm was a tool for anything. I rarely see a recommendation that matches anything I am interested in. Just throwing out a bunch of things which are vaguely similar in some way, doesn't seem to compel me to watch any of them.


  • Irv Rubin, 7 Jan 2020 @ 5:19pm (flagged by the community)

    re: is grossly insulting to people's ability to think for themselves

    This is Mike Masnicks stamp of approval on the Anti Defamation League deploying military derived psychological operations on an unsuspecting public, via programs like Moonshot CVE.

    Its one thing to see a static advertisement on a webpage, and another thing entirely to be subjected to redirection, which is the equivalent of a man-in-the-middle attack on the viewer, and the split second thought processes of choice .

    Redirection via algorithm or other methods like page hijacking is like a guy in a trench coat jumping in front of a jogger and flashing his wang, and later, claiming he was merely trying to sell some watches.


    • bhull242 (profile), 9 Jan 2020 @ 11:25pm

      Re:

      I’m unfamiliar with social media or the like doing what you’re talking about, so I have no idea how this relates to the article.


  • Rick O'Shea, 7 Jan 2020 @ 8:33pm

    Do non-radical movies and discussions make people less radical?

    Is there actually any real science behind this whole idea that non-"radical" people can be turned into terrorists by watching "radical" information videos? If so, can someone point me to it please.
    In my experience, people who constantly express "radical" ideas, have always been angry and reactionary about the topics they discuss. I don't think people become radicalized by other people's comments, but simply become emboldened by the sense that they are not alone in their way of thinking about something and thus express their own ideas more openly and more often. Since many radically thinking people would normally be silent due to social pressure, this "acceptance" by a discussion group could lead to that person's sudden verbose expression of these ideas, and this could give the impression that they have become "radicalized" by speech, but in truth they already were radical but silent.
    I can't help but think that this whole "radicalization" thing is just another phony "anti-terrorist" shtick that allows new laws to be written specifically to limit public access to information that the Government wants kept under wraps.


    • Scary Devil Monastery (profile), 8 Jan 2020 @ 2:56am

      Re: Do non-radical movies and discussions make people less radical

      "Is there actually any real science behind this whole idea that non-"radical" people can be turned into terrorists by watching "radical" information videos? If so, can someone point me to it please."

      There isn't. On the contrary it's been effectively debunked any number of times.

      That the concept still lives as a meme at all can mainly be blamed on various religious or "moral police" groups who feel that people talking about sex will lead to filth, indecency, and to decent church-going, god-fearing heterosexuals magically turning LGBTQ. Naturally reading about muslims, Marx or Martin Luther King Jr. will see you becoming an ISIS cultist, radical communist, or scofflaw anarchist by the same broken logic.

      In the end it's always the unimaginative feeble-minded bigot who ends up believing that discussing something will lead to becoming that something.


    • Stephen T. Stone (profile), 8 Jan 2020 @ 7:15am

      Is there actually any real science behind this whole idea that non-"radical" people can be turned into terrorists by watching "radical" information videos?

      It’s mostly anecdotal experience. But the videos aren’t the starting point for radicalization anyway — they’re the bridge to it. Radicalization tends to start small. Y’know, one person talking to another person or two, even if only through the Internet. Someone may hold an idea that could become “weaponized”, but without the proper funneling, that someone would never become radicalized. To wit: every “terrorist” arrested by the FBI that was first radicalized by the FBI for the purpose of being arrested.

      A long road exists between “I don’t like Muslims” and “I want to burn down all the mosques in the state”. No one starts walking it without a guide.


      • Sorry for InteROGSerating, 9 Jan 2020 @ 1:06am

        Re: its not just the FBI

        I totally agree with your analysis, except for your omission of other agencies that radicalize these guys, which is what this hide the ball CVE scheme is .

        The evidence of that can be found in the many cases such as William Atchisson, who had online contact with actual British intel types, including an MP before he went ballistic.

        Or, any of many whose online presence is deleted, after the radicalizers have their way with these poor souls online.

        Other evidence can be found with SOCMINT analysis by tracing the online “friends ” of the radicalized victims, and also in the fact that these agencies delete the shooters /car crashers / incel butter knifers Facebook /Twittercetera accounts after the fact to hide the evidence of this unchecked tactical assault on individuals.

        CVE really is bizarre, and crosses every line of ethical online conduct.


      • Rick O'Shea, 10 Jan 2020 @ 10:10pm

        Re: Joe + Video = Terrorist

        Thought I'd do a bit of research myself and It appears that there is no hard evidence of this connection available on-line beyond opinions - like ass-holes of course, everyone seems to have one. The term itself appears to be something cooked up in a think tank by government types - like the term 'insurgent' - a little Hegelian social magic I guess. As to the videos being a bridge to radicalization, I would hazard a guess that they don't hold a candle to the bridge created by a bad government doing shitty stuff to its citizens. My ass-hole, er, opinion, is that radicalization starts at home. If your folks are anti-black, or anti-muslim, or anti-whatever, there's a good chance you will be too, simply from constant exposure. My gut tells me its people who already have a hate-on for something who "get radicalized" from discovery of a support group or film work and not Joe the Tolerant. In other words, there aint no such animal, but its great for allowing the creation of new "For The Children" population control laws. The FBI Make-A-Terrorist program doesn't actually "radicalize" anyone either, since it just targets someone of low intellect who has already made it clear online what his or her primary beef is. The FBI just plans out the act-of-terror and offers to supply the nutter with cash and bombs if he'll pull it off for them. That always seemed to be illegal, but then, illegal seems to be getting a whole new paint job these days. Well, that's my asshole and until contrary evidence pops up, I'm sticking to it.


  • Paddy O’Rice, 8 Jan 2020 @ 12:46am (flagged by the community)

    informed consent for Influence Operations?

    Well, we fundamentally agree that the whole "radicalization" thing is just another phony "anti-terrorist" shtick that allows new laws to be written specifically to limit public access to information that the Government wants kept under wraps

    But in my experience with people actually targeted by these dark programs -and for the record, many Somalis, Muslims, Asians, etc. - these programs are not what are being described in the available public data, or studies, for all of the same reasons: investigative privileged information, nashunul secrecy, tribal-sectarian bullying disguised as genuine concern, etc.

    So, I think there needs to be a time when we actually do journalism and ask the actual targets of these programs how the programs actually work.

    So, talking to the victims of these programs firsthand is a start, and I have done that. In fact, I privilege their narratives above all others.

    These victims report:

    • strange shenanigans on their cellphones during gatherings or protests, including the redirection of their web searches into illicit content via page hijacking and more. This is a HUGE and quite common complaint

    • internet slowdowns and total corruptions of their communications, including especially intercepted emails that never arrive at their intended destinations. This is also quite common

    • extremely bizarre offline surveillance after playing video games online, or participating on Twitter, etc.

    • being followed everywhere they go by community policing agents and NGOs. This is also extremely common

    • activists reporting that their online conversations are repeated back to them in meatspace by said community policing scum

    • break -ins to their homes and cars

    • computer implants, and endless hacking directed at their communications

    I mean, the list is endless of how these CVE programs are integrated together, via agencies, Infragard agents and NGOs, and much much deeper than what this study portrays, and that, starting with speech policing via inferred intents of a speaker, and lots of word twisting, interpreted by paranoid religious nutjobs and their hench persons.

    That biased study for example has the interesting category of “anti-theists” listed as a targeted group.

    I mean anti-theists? Really? Talk about a minority group.

    What kind of person states that atheism, secularism, or other non-tribal religious or NON-religious person or group as ANTI something, unless the researcher themselves has taken a stand against those who enforce religious constructs upon others?

    So, radicalization is in fact an entire political process, rather than an issue of an innocent and binary, and simplistic mechanistic algorithym redirecting content, because in fact and practice these are tied to offline parallel colliding investigations under the ATAP model that uses the ponerology and the pseudo-science of threat assessment, with real world police action targeted constantly at individuals; and, every mass shooter who was harrassed this way before they went ballistic.

    Its Really Obvious Group Stalking, targeting that one guy who values the right to be left alone, and other rights that free societies hold dearly, and that, from a sociological and psychological perspective has all been studied, but not highlighted in media in any substantive way.

    So getting at the kernel to hack the false basis of the radicalizers rationale requires sociology to debunk their goal post shifting narratives.

    A good place to start is in psychology, and the well documented issue of consent, wherein we discover that internet radicalizers are NOT utilizing protocols of informed consent during online influence operations, and that even psychologists consider informed consent as the key to changing the brains functions, beliefs, and behaviors, not internet middle-man attacks:

    https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4196540/


    • bhull242 (profile), 9 Jan 2020 @ 11:28pm

      Re: informed consent for Influence Operations?

      Are you fundamentally incapable of talking about what’s actually in the article? Are you incapable of finding websites that actually support what you’re talking about?


    • Rick O'Shea, 10 Jan 2020 @ 10:19pm

      Re: informed consent for Influence Operations?

      Sorry dude. You completely lost me there.
      And now I gotta go look up ponerology.


    • bhull242 (profile), 11 Jan 2020 @ 4:29pm

      Re: informed consent for Influence Operations?

      By the way, antitheism is a thing. You know how a lot of people think that atheism is knowing there is no god (rather than saying there is no proof that one or more exist and merely lacking belief) and are against religion (rather than just not believing? The people who actually fit that description are anti-theists. They do exist, though they’re probably less common than atheists. Including them as a group is likely out of thoroughness than some faith-based bias. I believe antitheists may be a subset of atheists (like how agnostic atheists are a subset of atheists and agnostics), but I could be wrong. If so, atheism would be lacking belief in God or gods or some other deity(s), while antitheism is the belief that no god(s) or deity(s) exist. Regardless of which definition you use, there is still a distinction between antitheism and atheism.

      On a related note, the other false description many religious people give atheists is “people who hate God but do actually believe he exists”? People like that also exist, and they are called “maltheists”. Unlike atheism or antitheism, which are not subsections of theism, a maltheist does in fact involve the belief in the existence of a god(s) or other deity(s); they just don’t believe that any such deity is remotely benevolent. On the contrary, they believe that the deity(s) is/are more malicious than helpful.

      And just to round this up, an anti-Christian may be a maltheist, some other theist, atheist, or antitheist, but it specifically means that person is opposed to Christianity. It doesn’t actually say much about their religious beliefs (other than not being a Christian of course). As for agnostic, it’s a philosophy that the existence of any God, god(s), goddess(es), or other deity(s) cannot be proven, disproven, or known to be true or false. Again, this doesn’t necessarily say anything about one’s religious beliefs per se; agnostics may be Christian, Jewish, Muslim, pagan, neo-pagan, Hindu, Buddhist, Shinto, Sikh, New-Age, Mormon, voodoo, atheist, antitheist (unless “antitheist” means “believes that the existence of any god(s) can be and has been disproven”), pantheist, pan-en-theist, or maltheist. The opposite would be “gnostic”.


