Flip Side To 'Stopping' Terrorist Content Online: Facebook Is Deleting Evidence Of War Crimes
from the not-for-the-first-time dept
Just last week, we talked about the new Christchurch Call, and how a bunch of governments and social media companies made some vague agreements to try to limit and take down "extremist" content. As we pointed out, however, there appeared to be little to no exploration by those involved of how such a program might backfire and hide content that is otherwise important.
We've been making this point for many, many years, but every time people freak out about "terrorist content" on social media sites and demand that it be deleted, what really ends up happening is that evidence of war crimes gets deleted as well. This is not an "accident" or a case of such systems being misapplied; it is a direct consequence of the simple fact that terrorist propaganda often is important evidence of war crimes. It's things like this that make the idea of the EU's upcoming Terrorist Content Regulation so destructive. You can't demand that terrorist propaganda be taken down without also removing important historical evidence.
It appears that more and more people are finally starting to come to grips with this. The Atlantic recently had an article bemoaning the fact that tech companies are deleting evidence of war crimes, highlighting how such videos have actually been really useful in tracking down terrorists, so long as people can watch them before they get deleted.
In July 2017, a video capturing the execution of 18 people appeared on Facebook. The clip opened with a half-dozen armed men presiding over several rows of detainees. Dressed in bright-orange jumpsuits and black hoods, the captives knelt in the gravel, hands tied behind their back. They never saw what was coming. The gunmen raised their weapons and fired, and the first row of victims crumpled to the earth. The executioners repeated this act four times, following the orders of a confident young man dressed in a black cap and camouflage trousers. If you slowed the video down frame by frame, you could see that his black T-shirt bore the logo of the Al-Saiqa Brigade, an elite unit of the Libyan National Army. That was clue No. 1: This happened in Libya.
Facebook took down the bloody video, whose source has yet to be conclusively determined, shortly after it surfaced. But it existed online long enough for copies to spread to other social-networking sites. Independently, human-rights activists, prosecutors, and other internet users in multiple countries scoured the clip for clues and soon established that the killings had occurred on the outskirts of Benghazi. The ringleader, these investigators concluded, was Mahmoud Mustafa Busayf al-Werfalli, an Al-Saiqa commander. Within a month, the International Criminal Court had charged Werfalli with the murder of 33 people in seven separate incidents—from June 2016 to the July 2017 killings that landed on Facebook. In the ICC arrest warrant, prosecutors relied heavily on digital evidence collected from social-media sites.
The article notes, accurately, that this whole situation is kind of a mess. Governments (and some others in the media and elsewhere) are out there screaming about "terrorist content" online, but pushing companies to take it all down has the secondary impact of both deleting that evidence from existence and making it that much more difficult to find those terrorists. And when people raise this concern, they're mostly being ignored:
These concerns are being drowned out by a counterargument, this one from governments, that tech companies should clamp down harder. Authoritarian countries routinely impose social-media blackouts during national crises, as Sri Lanka did after the Easter-morning terror bombings and as Venezuela did during the May 1 uprising. But politicians in healthy democracies are pressing social networks for round-the-clock controls in an effort to protect impressionable minds from violent content that could radicalize them. If these platforms fail to comply, they could face hefty fines and even jail time for their executives.
As the article notes, the companies' rush to appease governments demanding such content be taken down has already made the job of those open source researchers much more difficult, and has actually helped to hide more terrorists:
Khatib, at the Syrian Archive, said the rise of machine-learning algorithms has made his job far more difficult in recent months. But the push for more filters continues. (As a Brussels-based digital-rights lobbyist in a separate conversation deadpanned, “Filters are the new black, essentially.”) The EU’s online-terrorism bill, Khatib noted, sends the message that sweeping unsavory content under the rug is okay; the social-media platforms will see to it that nobody sees it. He fears the unintended consequences of such a law—that in cracking down on content that’s deemed off-limits in the West, it could have ripple effects that make life even harder for those residing in repressive societies, or worse, in war zones. Any further crackdown on what people can share online, he said, “would definitely be a gift for all authoritarian regimes. It would be a gift for Assad.”
Of course, this is no surprise. We see this in lots of contexts. For example, the focus on going after platforms for sex trafficking with FOSTA hampered the ability of police to find actual traffickers and victims by hiding that material from view. Indeed, just this week, a guy was sentenced for sex trafficking a teenager, and the way he was found was via Backpage.
This is really the larger point we've been trying to make for the better part of two decades. Focusing on putting liability and control on the intermediary may seem like the "easiest" solution to the fact that there is "bad" content online, but it creates all sorts of downstream effects that we might not like at all. It's reasonable to say that we don't want terrorists to be able to easily recruit new individuals to their cause, but if blocking that recruitment makes it harder to stop actual terrorism, shouldn't we be analyzing the trade-offs there? To date, that almost never happens. Instead, we get the usual form of a moral panic: this content is bad, therefore we need to stop this content, and the only way to do that is to make the platforms liable for it. That assumes -- often incorrectly -- a few different things, including the idea that magically disappearing the content makes the activity behind it go away. Instead, as this article notes, it often does the opposite and makes it more difficult for officials and law enforcement to track down those actually responsible.
It really is a question of whether or not we want to be able to address the underlying problem (those actually doing bad stuff) or sweep it under the rug by deleting it and pretending it doesn't happen. All of the efforts to put liability on intermediaries really turn into an effort to sweep the bad stuff under the rug, to look the other way and pretend that if we can't find it on a major platform, it's not really happening.
Filed Under: christchurch call, content moderation, evidence, extremist content, terrorist content, war crimes
Companies: facebook, google, twitter
Reader Comments
Are you saying that the politicians who complained yesterday about the Facebook "filter bubble" now want Facebook to filter more?
See also: Pornography, violent media, and overt bigotry.
Re:
Confirmed correct. Banning All The Things! doesn't make them go away, just underground. We need to address the attitudes behind the expression instead of sweeping them under the ban carpet.
What makes you think "flip side"? Facebook deletes Israeli crime
You're trying to flip this to divert from Facebook's EVERYDAY practices as if it's valiantly struggling to do good.
Facebook's Secret Censorship Manual Exposed as Platform Takes Down Video About Israel Terrorizing Palestinians
http://www.informationclearinghouse.info/50851.htm
`Facebook is a private company!' shout people in favor of censoring political content
https://www.rt.com/news/451872-facebook-private-company-censorship/
And what I state is that so long as a business is in The Public's markets, then WE have final say in how it operates. -- Even if it hadn't agreed to a license.
Inside FACEBOOK Secret Rulebook for Global Political Speech...
https://www.msn.com/en-us/news/technology/inside-facebook*s-secret-rulebook-for-global-political-speech/ar-BBRvjsg
Now, WHO here is always for surveillance capitalism / corporate censorship and doesn't want these global-mega-corps to be regulated, let alone cut down to size? Masnick.
This is just more of Masnick's ongoing "left-libertarian" cover story as if he's for free speech and free markets. But like Facebook, what you DON'T see mentioned by Techdirt reveals its true agenda.
Re: Now, WHO here is always for surveillance capitalism
Now, WHO here is always for surveillance capitalism
That is a very good question, thanks for asking!
I'd say it is the Pro-Copyright folks (Blueballs, John, others) who are always screaming about how they want corporations to monitor (and moderate!) all user activities.
Obviously they both want corporations to see everything we do to stop copyright violations. Surveillance Capitalism needs strong copyright law to justify the oversight.
Please cite the law, statute, or court ruling that says the public has "final say" in what speech a privately owned platform open to UGC will and will not host.
Re: What makes you think "flip side"? Facebook deletes Israeli crime
So then I say that all movie studios release all of their movies via torrents! That is the final say on how they operate, because the business is in The Public's markets!
It seems a bit late to try and control the narrative, but it looks like that is what the cult of conservative is up to. The cat is out of the bag, the horses have left the barn, Pandora's box is open, and it is too late to put the genie back in the bottle, but they will continue to try because that is all they know, apparently.
Any good coverup now has to include methods of scrubbing the internet of all instances of said story and, as many here know, this is near impossible.
Seems it is not the content that they dislike; it is the discussion of terrorism they do not like, because of their involvement.
Re:
The only thing wrong with your comment is the word conservatives, as EVERYBODY who does not like something on the internet is demanding exactly the same thing: that what they personally dislike be removed and replaced with something they personally like.
Re: Re:
"EVERYBODY who does not like something on the internet is demanding exactly the same thing"
I doubt it ... I'm not.
Many complain about creeping fascism while others point out how the law is not equally enforced. These people are not demanding an internet fairness law nor are they demanding social media carry their message. There is a huge difference between what people want and how they communicate same.
I do not demand removal of the stupid crap people type/say/record/whatever; stupid crap should be given rebuttals, not removal.
Including you? Or are you “the exception that proves the rule”?
See also: how FOSTA is designed to protect sex traffickers by obscuring evidence of their actions.
Someone needs to take that company and pull the plug on it.
Re:
Which company? The one that's been pressured by politicians to remove content that allows people to find criminals and do something about them, or some other company for... some reason?
Why Facebook has censorship.
I do not often read RT news, for the simple reason that I grew up in the Cold War, but on occasion I do.
RT has an article, a portion of which I have copied and pasted below with a few modifications.
Hollywood actress ***** has just learned that there’s nothing so illiberal as a liberal when the received truths that underpin their worldview are challenged.
Her experience of being dogpiled on social media for daring to echo concerns over the official narrative surrounding ***** conforms to an established pattern.
It dictates that anyone who dares question the official narrative on things such as ***** is subjected to an ever more intense level of character assassination and calumniation.
Focusing first on forensic work undertaken by a group of dissident academics has induced a state of hysteria within the mainstream media.
For their trouble, members of the group have found themselves depicted as enemies of the people in (mainstream media), their pictures and personal details published and pressure put on the universities which employ them to sack them. And all for daring to cast doubt on events such as .
(She) is not part of the , yet for daring to cite their work ***, she found herself being mauled on social media. She was no doubt unaware that in doing so she was guilty of giving succour to ‘Assadists,’ ‘conspiracy theorists,’ and that she’s ‘naive.’
This is really important. Why aren’t we talking about it?
"We may have just discovered a major piece of the puzzle explaining how seemingly independent international organizations help deceive us into consenting to wars and regime change interventionism around the world."
— May 17, 2019
Some subjects are such that anything posted has an immediate response from one group or the other with a barrage of hate so profound and intense that for all practical purposes the author would have been better off committing suicide.
For verification one can read the unredacted article at
https://www.rt.com/op-ed/459821-sarandon-media-opcw-syria/
…okay, then.
Re: Why Techdirt has Moderation
Hey blue balls. See that? That's how you do crazy right. Not the brain drooling slowly out of your mouth like you practice.
Re: Why Facebook has censorship.
I do not often read RT news
Hey - an actual Russian Troll! And I thought trying to cite Infowars was bad. This shit is amazeballs.
A very stable genius said that Putin is our bestie - so we'd better take heed of RT News.
'If I can't see it, it's not a problem.'
It really is a question of whether or not we want to be able to address the underlying problem (those actually doing bad stuff) or sweep it under the rug by deleting it and pretending it doesn't happen.
A question that the politicians involved have answered pretty clearly: Brush it under the rug.
Blaming the companies and forcing them to take the videos down makes the problems/war crimes less visible, so politicians look like they're Doing Something. Actually doing something about the problems/war crimes takes work, and would require explaining to the easily triggered that yes, objectionable content is still up, because that's the best way to find who did it and stop them. What politician wants to deal with that when it's easier to just blame the companies and take all the credit for themselves?
The goal of bills like that isn't to address the underlying problems, to stop the actual criminals and/or rescue the victims. It's merely to present the facade of Doing Something, to make it look like the politicians are addressing the issues by making them less visible, when all they've really done is hand a huge gift to the criminals involved by making them harder to find.
Re: 'If I can't see it, it's not a problem.'
The problem of censorship vs. discretion became obvious when videos of those cursed terrorists were cutting off people's heads and the CEOs of those platforms couldn't be trusted to use common-sense decency and discretion to immediately block and/or remove those acts of inhuman horror.
Re: Re: 'If I can't see it, it's not a problem.'
Just a tip for the future, but it helps if you actually read the article (all of it) before commenting, as it allows you to avoid such fumbles as 'advocating for something that would have prevented the person involved in the killings from being caught'.
The fact that the video was still up is what allowed people to find at least one of those responsible and charge them with the murders, something that would not have been possible had the video, as abhorrent as it was, been immediately taken down.
It's part of the new plan to sweep up extremist content:
Creating A Reliable Program to Exterminate Terrorism
Re:
I think you mean: Creating Reliable Antiterrorist Program
Re:
Creating Less Expensive Avenues Violating Everyone's Rights
CLEAVER
Along with its sister plan, the “Sympathizing With Extremism” Eradication Ploy.
Why free speech?
Censorship always has worse outcomes than unrestricted public discourse.
Responding to criminal acts with....criminal acts?
If you look closely at the Federal obstruction statutes, this sounds like some folks are "engaged in misleading conduct towards another person" (lying to Facebook about the effects and consequences of the takedown) "with intent to" (they want the takedown) "cause or induce any person" (Facebook and employees thereof) "to ... withhold a record, document, or other object" (the "terrorist content" they want gone) "from an official proceeding" (any attempt to actually catch and punish The Bad Guys™). (18 USC 1512(b)(2)(A))
While this doesn't apply directly to Congressmuppets (due to Speech or Debate Clause immunity), it sounds like something that the rest of the muppets who are suggesting "take down all the bad things!" need to be reminded of...
This is by design, because it's ever so much easier for governments to brush problems under the rug than solve them.