Removing Terrorist Content Isn't Helping Win The War On Terror
from the misguided-efforts dept
The terrorists are winning.
This shouldn't come as a surprise. The War on Drugs hasn't made a dent in drug distribution. Why should the War on Terror be any different? Two decades and several billion dollars later, what do we have to show for it? Just plenty of enemies foreign and domestic.
While politicians rail against "terrorist content," encryption, and the right of people to remain generally unmolested by their governments, they're leaning hard on social media platforms to eradicate this content ASAP.
And social media companies are doing all they can. Moderation is hard. It's impossible when you're serving millions of users at once. Nonetheless, the content goes down. Some of it is actual "terrorist content." Some of it is journalism. Some of it is stuff no one would consider terroristic. But it all goes down because time is of the essence and the world is watching.
But to what end? As was noted here all the way back in 2017, efforts made to take down "terrorist content" resulted in the removal of evidence of war crimes. Not much has changed since then. This unfortunate side effect was spotted again in 2019. Target all the terrorist content you want, but destroying it destroys evidence that could be used to identify, track, and, ultimately, prosecute terrorists.
Sure, there's some concern that unmoderated terrorist content contains the inherent power to radicalize internet randos. It's a valid concern, but it might be outweighed by the positives of keeping the content live. To go further, it might be a net gain for society if terrorist content were accessible and easily shared. This seems counterintuitive, but there's a growing body of research showing terrorists + internet use = thwarted terrorist plots.
Call me crazy, but this sounds like a better deal for the world's population than dozens of surveillance agencies slurping up everything that isn't nailed down by statute. This comes from Joe Whittaker at Lawfare, who summarizes research suggesting swift removal of "terrorist content" isn't helping win the War on Terror.
In my sample, the success of an attempted terrorist event—defined as conducting an attack (regardless of fatalities), traveling to the caliphate, or materially supporting other actors by providing funds or otherwise assisting their events—is negatively correlated with a range of different internet behaviors, including interacting with co-ideologues and planning their eventual activity. Furthermore, those who used the internet were also significantly more likely to be known to the security services prior to their event or arrest. There is support for this within the literature; researchers at START found that U.S.-based extremists who were active on social media had lower chances of success than those who were not. Similarly, research on U.K.-based lone actors by Paul Gill and Emily Corner found that individuals who used the internet to plan their actions were significantly less likely to kill or injure a target. Despite the operational affordances that the internet can offer, terrorist actors often inadvertently telegraph their intentions to law enforcement. Take Heather Coffman, whose Facebook profile picture, an image of armed men with the text “VIRTUES OF THE MUJIHADEEN,” alerted the FBI, which deployed an undercover agent and eventually led to her arrest.
Correlation isn't causation but there's something to be said about visibility. This has been a noticeable problem ever since some law enforcement-adjacent grandstanders started nailing every online service with personal ads to the judicial wall for supposedly facilitating sex trafficking. Ads were pulled. Services were halted. And sex traffickers became increasingly difficult to track down.
As this research notes, radicalization might occur faster with heavier social media use. But this isn't necessarily a bad thing. Greater visibility means easier tracking and better prevention.
Operating out in the open also means encryption isn't nearly as much of an issue. Terrorist organizations appear to be voluntarily moving away from open platforms, sacrificing expeditious radicalization for privacy and security. But even that doesn't appear to pose nearly as much of a problem as politicians and law enforcement officials suggest.
When looking at the Islamic State cohort in the United States, unlike other online behaviors, there is not a significant relationship between the use of end-to-end encryption and event success. Terrorists who used it were just as likely to be successful as those who did not.
Unfortunately, there are no easy answers here. While driving terrorists underground results in limited visibility for those seeking to thwart their plans, allowing them to take full advantage of open platforms increases the number of possible terrorists law enforcement must keep an eye on.
The downsides of aggressive moderation, however, are clear. Visibility decreases as the possibility for over-moderation increases. Evidence needed for investigations and prosecutions vanishes into the ether over the deafening roar of calls to "do more."
Filed Under: content moderation, content removals, open source intelligence, terrorism, terrorist content
Reader Comments
So this means that there is some evidence that freedom of speech, even that of horrible monsters, may in fact be a good thing?
Re:
And it only took us 150+ years to relearn that lesson. Go us.
But it's a good way for a certain few to make out that it does and thus enhance their chances of re-election!
Typical politicians! When their lips are moving, you know the fuckers are lying!!
But how does visibility affect recruitment?
So visibility leads to more captures (less successes). But does visibility lead to more recruitment?
Greater visibility is better only if the capture rate is higher than the recruitment rate?
Re: But how does visibility affect recruitment?
I think you're looking at it wrong. Comparing how many join to how many are killed is a fool's choice. I don't care if half the nation joins if it means that deaths decrease. Let them yammer all they want - it just means they're more likely to be caught. If you drive them underground, you may not have to listen to as many assholes, but you stand a better chance of getting killed by those same fewer assholes.
knowledge is greater than ignorance
The whole idea of removing this content and other undesirable content from the internet was never only about protecting the children or childish adults. It was about feeling morally superior while putting your head in the sand.
If you can't see that something is a problem, or if you can hide it from the majority of people, then effectively the problem doesn't exist. You can ignore it until a later time or, if you are lucky, pass the problem off to someone else, which in turn can make you or those you represent (or are in charge of) look better.
But the end result is that your own people become more susceptible to manipulation and oppression, because they never learned how to handle a situation where the information presented differed from their own preconceived notions. Basically, your population never grew up and never dealt with the real world.
If you don't even talk about the real world, you can't learn from others or from your own past. In engineering, people study the failures of the past so that they don't repeat those mistakes in the future. If the knowledge of how past designs failed were suddenly gone or locked up, the industry would constantly fail going forward, because you can't learn when you don't have access to knowledge.
If people learn about why someone becomes a terrorist then they can also take steps to address the cause or at least recognize similar patterns in the future.
I just wish repeating mistakes and burying our heads was the only consequence of hiding "bad" content.
Unfortunately it is also something that powerful people with evil intent have found useful because it helps them retain power. That's why one of the first things those people do is try to control the narrative. If a population is ignorant they can't organize. They can't resist the slow destruction of their world or way of life.
The only thing that truly protects people is knowledge. Guns and armies can help, but if you don't know who your real enemy is, you can't confront them. And so those in power continue to exploit the populace for their own gain.
Which is why the solution to problems is never to hide the knowledge or facts. It is education and transparency.
Perhaps, but never underestimate the power of stupid people in large groups. On an individual basis knowledge > ignorance. On a nationwide scale stupid is winning.
For Such..
For such an open-minded country, we seem to be doing a lot to restrict OTHERS' ideals.
Hiding reality under a bush does not get rid of the problem; it just makes it harder to find and FIX.
Have we ever left the McCarthy-era rhetoric? NOT really. We like to blame and point fingers and declare that OTHERS did it, without proof or any logic in the comment.
I love suggesting how BAD Christians USED to be (and some still are) and watching them cringe at the thought and demand that I am wrong... then I ask them how old the New Testament is, and THEY DON'T KNOW.
A Perpetual War isn't even meant to be won, ever.
Re:
Depends on how you define 'winning'. If your goal is to endlessly funnel money to individuals/industries and/or whip up fear to secure power that would otherwise be denied to you by pesky things like 'constitutional rights' then perpetual 'war' can mean perpetual winning.
'If we can't see it, it's not a problem.'
Constantly brushing atrocities under the rug doesn't do squat to actually prevent them or punish the perpetrators, but it makes for great PR that people and agencies can point to as proof that they are Doing Something.
Even better, if you shift the blame to the platforms hosting said atrocities, then when any slip through you can simply blame the platforms for failing to catch everything, while claiming all the 'success' from forcing them to play whack-a-nutjob in the first place.
re: ADL/FBI manufactured terror
re: "unmoderated terrorist content contains the inherent power to radicalize internet randos"
Most, if not ALL, of those “randos” had EXTENSIVE pre-incident contact online with police types, deplatformer types, and private security contractors working for or with the various mystery-money NGOs/ADL/FBI/et-alphabets, and any of a number of LEIUs and the Association of Threat Assessment Professionals (ATAP).
This cyber-stalking comes offline after these agents and agencies deploy psyops on their victims.
ATAP calls these constant, never-ending fake investigations/personalized harassment campaigns "the colliding parallel investigation."
Where did the peanut gallery go? I expected a couple hundred comments along the lines of "cesspool," "nazi lover," "social media companies can't be forced to host content they disagree with," and "make your own platform."
Instead it is crickets and:
"...they're leaning hard on social media platforms to eradicate this content ASAP."
with which they have no problem. Good to know. Shows what true bigoted hypocrites they really are.