Be Cautious About Big Internet Platforms Bearing Plans For Global Censorship
from the let's-not-get-carried-away-here dept
In the wake of the Christchurch shooting massacre in New Zealand, there has been a somewhat odd focus on the internet platforms -- mainly those that ended up hosting copies of the killer's livestream of the attack. As we previously discussed, this is literally blaming the messenger, and taking away focus from the much deeper issues that led up to the attack. Still, in response, Microsoft's Brad Smith decided to step forward with a plan to coordinate among big internet companies a system for blocking and taking down such content.
Ultimately, we need to develop an industrywide approach that will be principled, comprehensive and effective. The best way to pursue this is to take new and concrete steps quickly in ways that build upon what already exists.
Smith points to an earlier agreement between YouTube, Facebook, Twitter and Microsoft to form GIFCT, the Global Internet Forum to Counter Terrorism, by which the various big platforms share hashes of content deemed "terrorist content" so they can all spot it across their platforms. Here, Smith suggests expanding that effort:
We need to take new steps to stop perpetrators from posting and sharing acts of violence against innocent people. New and more powerful technology tools can contribute even more than they have already. We must work across the industry to continue advancing existing technologies, like PhotoDNA, that identify and apply digital hashes (a kind of digital identifier) to known violent content. We must also continue to improve upon newer, AI-based technologies that can detect whether brand-new content may contain violence. These technologies can enable us more granularly to improve the ability to remove violent video content. For example, while robust hashing technologies allow automated tools to detect additional copies already flagged as violent, we need to further advance technology to better identify and catch edited versions of the same video.
We should also pursue new steps beyond the posting of content. For example, we should explore browser-based solutions – building on ideas like safe search – to block the accessing of such content at the point when people attempt to view and download it.
We should pursue all these steps with a community spirit that will share our learning and technology across the industry through open source and other collaborative mechanisms. This is the only way for the tech sector as a whole to do what will be required to be more effective.
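For readers unfamiliar with how this hash-matching actually works: systems like PhotoDNA compute a perceptual fingerprint of a known image or video frame, and new uploads are compared against a shared list of those fingerprints, with anything within a small distance treated as a match. PhotoDNA itself is proprietary, so the following is only a rough sketch in Python, using a simple average-hash and made-up frame data, to illustrate the general idea of shared hashes and fuzzy matching that GIFCT-style coordination relies on.

    def average_hash(pixels):
        # Compute a simple 64-bit perceptual hash from an 8x8 grayscale frame.
        # `pixels` is a list of 64 brightness values (0-255). Each bit records
        # whether the corresponding pixel is brighter than the frame's average.
        # Real systems (PhotoDNA, pHash) are far more robust, but the idea is
        # the same: similar frames yield similar bit patterns.
        avg = sum(pixels) / len(pixels)
        bits = 0
        for value in pixels:
            bits = (bits << 1) | (1 if value > avg else 0)
        return bits

    def hamming_distance(a, b):
        # Count the bits that differ between two hashes.
        return bin(a ^ b).count("1")

    def matches_shared_list(frame_hash, shared_hashes, threshold=5):
        # A GIFCT-style consortium would distribute `shared_hashes`; each
        # platform would run a check like this against new uploads.
        return any(hamming_distance(frame_hash, h) <= threshold
                   for h in shared_hashes)

    # Hypothetical data: a flagged frame, a re-encoded copy with slight
    # brightness changes, and an unrelated frame.
    flagged  = [200 if i % 3 else 40 for i in range(64)]
    reupload = [p + (2 if i % 7 else -3) for i, p in enumerate(flagged)]
    benign   = [i * 4 for i in range(64)]

    shared = {average_hash(flagged)}
    print(matches_shared_list(average_hash(reupload), shared))  # expected: True
    print(matches_shared_list(average_hash(benign), shared))    # expected: False

The point of the distance threshold is to catch lightly edited copies -- the "edited versions of the same video" Smith mentions -- at the cost of potential false positives.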
Some of this may be reasonable, but we should be careful. As Emma Llanso neatly lays out in a series of tweets, before we expand the power and role of GIFCT, we should take care of many of the existing concerns with the program. Here's a (lightly edited) transcription of Llanso's concerns:
In Brad Smith's post on Microsoft's response to the New Zealand attacks, we see another example of a company promoting an expanded role for the GIFCT without addressing any of the long-standing transparency and accountability issues with the consortium. Smith makes several proposals to further centralize and coordinate content-blocking by major tech companies and fails to include any real discussion of transparency, external accountability to users, or safeguards against censorship.
The closest he gets is describing a "joint virtual command center" of tech companies to coordinate during major events, which would enable tech companies to ensure they "avoid restricting communications that [tech companies unilaterally] decide are in the public interest". Public interest must be part of the analysis, but media orgs and nations have come to different conclusions about how to cover the NZ attacks. It's naive to suggest that a consensus view of "public interest" could, much less ought to, be set by a consortium of US-born tech companies.
There's also a chilling call to "explore browser-based solutions" to block people's ability to view or download content, with no recognition of how dangerous it is to push censorship deeper into infrastructure. "Safe-search" is user-controlled; would MSFT's terror-block be as well?
Smith is calling for discussion about how tech can/should be involved in responding to terrorism, which is reasonable. But any discussion that fails to include transparency and safeguards against censorship, from the very beginning, is irresponsible. I know that many people's instincts right now are focused on how to take more content down faster, but as Smith notes, "the public rightly expects companies to apply a higher standard." Takedown policies without safeguards are incomplete and are not "solutions".
Llanso makes a number of good points here, but a key one to me is this: while coordination and agreement to act together may sound like a good way to approach global-scale issues that can move from platform to platform, it also suggests that there is only one solution to such content (outright banning it across all platforms). That takes more creative or alternative approaches out of the equation. It also takes out context. As we've discussed before, in some cases someone's "terrorist content" is actually evidence of war crimes that it might be useful for someone to have.
Yes, lots of people are rightly concerned that videos and manifestos related to attacks may inspire copycat (or worse) attacks. But trying to stuff the entire thing down the memory hole in a single coordinated plan -- where the big internet platforms are the final arbiters of everything -- hardly seems like the right solution either. Indeed, taking such a position actually makes it that much harder for different platforms to experiment with different, and possibly more effective, ways of dealing with this kind of content.
Filed Under: christchurch, free speech
Reader Comments
It is their ball, and if we don't play by their rules...
Well, to me it doesn't seem like we need to worry about this. After all, if the EU gets their way there won't be an internet to host terrorist content.
/s
Ah good old witch-hunts...
It would seem that 'violent video games' are taking a backseat as 'the source of all corruption', with 'open internet platforms' taking their place as the scapegoat.
Re: Ah good old witch-hunts...
That goat died for our sins. We marched it to the edge of town, blamed it for all the sinning we've done this year, and sent it out, banished to wander the desert until it dies.
We're sin free!
Stupid evil goat.
Anyone for some infidelity? I've got some new derivatives. Wraps up sub-prime medical bills with emerging cryptocurrencies.
I'm curious why this is filed under "Free Speech" if there is no government involvement. Using "free speech" to cover non-governmental censorship only perpetuates misconceptions among many of your readership.
'Nice platform you got there...'
Could be because a lot of the push is due to government pressure ('step up, do the impossible, or we'll break out the regulations'), and when the actions being discussed are motivated less by the companies themselves and more by a desire to avoid government regulation, in a sense there is government involvement.
If there weren't politicians insisting that companies 'Nerd harder or else!', and they were just doing this stuff on their own, then yeah, it wouldn't really be a free speech issue in that way. But since there are, it's not unfair to put it into that category.
Re: 'Nice platform you got there...'
Yeah, it is worse as a chilling effect than even really shitty laws, and worse for the rule of law. If there were rules they could appeal or take action. Worse, there is never a line for "enough", and it is essentially weaponizing corporate fear of uncertainty.
They have their own culpability, of course, for bending instead of calling them on their bullshit publicly and unrelentingly. What they should be doing is hitting the bully back, humiliating them and making them cry.
Press them for details, tear their non-solutions apart along with the ugly implications of the policies, and be downright mean. So how exactly will taking down content about people doing bad things bring the dead back to life?
If it is possible to detect terrorism this easily, maybe your intelligence agencies should have developed that ability already, given their budgets and mass privacy violations?
Re:
“No government involvement”
Hahahahaha, you're funny, John.
You're a liar and no god will take you. But you are funny.
Re:
No, no, no. While perhaps not as closely tied to government as, say, those under the CCP, these companies are joined at the hip with governments at this point. So, at the very least, we have to treat these corporate entities as proxies for governmental censorship.
It's like with copyright law being outdated for the internet age, so are privacy protections and free speech laws. In many ways, the big internet corp. is worse for free speech since they have more direct ability to do it THAN EVER BEFORE IN HUMAN HISTORY ---AND--- they totally lack any accountability.
Re: Re:
"more direct ability to CENSOR than ever before"
Re: Re:
They have also enabled more people to publish their speech with a global reach than was ever possible before in history.
The best thing for the internet would be for all of these massive corporations to die off... but it would be sad to see Masnick lose out on valuable stenography clients.
Re:
Can you show us on the doll where the bad corporation touched you?
We get it - you were kicked off Facebook for being a creepy stalker and now you want them shut down for "discriminating against your incel viewpoint."
Re: Crybaby Johns in tears again
Ok bros everyone be super nice. Old John boy is feeling super fragile today.
Re:
The best thing for the internet would be for all of these massive corporations to die off
You work for one, Jhon. Unless you were lying, you probably don't want to bite the cock you feed from.
The New Zealand shooter live streams and puts out written material stating his intent is to create a spectacle so large that Western governments restrict freedoms. The hope is that restricting freedoms might lead to increased hostility from unreasonably restricted citizens, who would afterward take up the fight for those curtailed freedoms.
New Zealand government picks up that ball and runs forward with the objectives of the shooter.
Tech companies, not to be out-done, jump on board with ways that they can help restrict human rights because there was once a shooter.
I'm thinking of Charlie Manson and his similar helter skelter plan to try and kick off civil wars and wondering if governments and companies tripped all over themselves back then as well to give the murderer their every demand.
The lesson sent by the response from New Zealand and the tech giants is that violence is rewarded. We need a law stating that all knee-jerk responses are banned for a year after an event like this, so there is opportunity for discussion instead of acting on emotion. Shameful response from politicians.
'Never let a good tragedy go to waste'
It rather reminds me of a quote I remember attributed to Bin Laden a number of years back, talking about how all he had to do was attack the US once and it would tear itself apart from the inside, and would you look at that, both then and now you've got politicians tripping over themselves to crack down on the public, handing the assholes everything they could have possibly dreamed of.
When a violent attack is guaranteed to result in instant national/global fame for those responsible, and the governments(local and up) losing their minds and cracking down on freedoms of the public, it's hardly a wonder you've got assholes pulling stunts like this, because whether they're looking for attention and/or panic they know it works.
Along those lines, while it obviously wouldn't solve everything, I can't help but wonder how much a global application of the Some Asshole Initiative would help. No fame, no deep dives into motivation/race/age, any shooter or idiot with a bomb is reduced to nothing more than 'Some asshole'. If nothing else, it would certainly improve the reporting on stuff like this by removing the fame granted to any asshole who shoots/blows up a bunch of people, and that alone would be an improvement.
Re: 'Never let a good tragedy go to waste'
"It rather reminds me of a quote I remember attributed to Bin Laden a number of years back, talking about how all he had to do was attack the US once and it would tear itself apart from the inside, and would you look at that, both then and now you've got politicians tripping over themselves to crack down on the public, handing the assholes everything they could have possibly dreamed of. "
Similar to how IS recruiters applauded Trump's victory, the same way Al-Qaeda cheered for GWB. They were counting on turning Muslims in the US into a persecuted minority they could recruit from, with some success, apparently.
If you are an extremist the first thing you want is a bigger pool of dissatisfied potential recruits. And the best way to do that is to turn your 3rd world land grab into a global conflict of ideology.
What bugs me is that back when the terrorists were the PLO, the IRA, the Basques, Baader-Meinhof, etc...every western leader was adamant that we would never allow the terrorists to force us to change our society.
Now look at us. IS committed an atrocity and we fell all over ourselves trying to dig ourselves into a surveillance society. Some asshat murders a bunch of people in NZ and governments suddenly compete in changing society the way he probably intended.
And scared shitless over a spate of ambiguous potential censorship legislation, private corporations and news agencies fall in line out of fear and anxiety.
Re:
Just an excuse for this psychopath to go on a joyful, mindless killing spree and portray some lame sense of intellectual superiority at the same time.
Here's a question I don't think I've seen raised. Where does video of police shooting someone fall in all this? Those are videos of people being killed too. Might even be live-streamed, by bystanders or victims. Such videos have proven important in holding police accountable in a variety of ways such as by not letting police PR sweep officer mistakes and misbehavior under the rug with false accusations.
So are those considered "violent video content" to be aggressively removed?
Re:
It is a very good question indeed, and it is one that the people pushing such initiatives, as well as the governments that are pressuring them to do so, don't give a single fuck about attempting to answer in a coherent, rational manner.
Then we have violent video game and movie content that could be dinged despite it being digital blood spilled by digital people. Everything from the Fatalities in Mortal Kombat 11 to the action scenes in John Wick could be declared “violent video content” and removed, despite the fact that such content shows fictional people being “killed”. I have to wonder how much of that content will be labeled as such — and how much companies will pay to “undo” that label because hey, even they need views on YouTube.
Re: Re:
Oh they do care....it's just that they care about hiding law enforcement misdeeds from the public. Police unions are going to love this. Video shows cop shooting unarmed suspect in the back? Terrorist content....it must be removed.
Re: Re: Re:
Terrorists wanting to hide their violence? Don't they want people to believe that they are much more violent than they really are, so as to get instant submission and compliance?
Re: Re: Re: aftermath
Some guy: So, Officer, now that a court has ruled that the video of you shooting that unarmed guy in the back was NOT what you said it was, how much do you think I should take from your department's budget and assets so that the personal harm you have done will not be repeated? 😁
You're telling me? -- You're telling ME, Google-boy?
About Global Censorship? After I've been censored for years right here at Techdirt, in part for railing at the dangers posed by global mega-corps, which were started for that specific purpose, are implementing it a little more every day, and are visibly merging with gov't too?
No, Masnick. Not even your fanboys are going to believe that you're suddenly worried about this.
Not after your long history of advocating absolute arbitrary control by corporations:
"And, I think it's fairly important to state that these platforms have their own First Amendment rights, which allow them to deny service to anyone."
https://www.techdirt.com/articles/20170825/01300738081/nazis-internet-policing-content-free-speech.shtml
The only way to deal with the inevitable control by BIG entities is to break them up. -- AND don't let globalism grow in the first place!
Corporations are NOT persons, contrary to what YOU believe and advocate, and do not even have a "Right" to exist at all: they first ask The Public for permission to exist, and agree to operate under rules in The Public's marketplace. -- NOT politics: to offer products and services. -- They're to serve The Public and operate by Our rules for general societal benefits, not for narrow and solely financial interests.
But as always Masnick's implicit premise is that these mega-corporations must not be limited -- that might stifle "innovation". He totally ignores that the "innovation" nowadays consists only of new ways to surveil and control. -- And Masnick NEVER calls for any action that would adversely affect their profits.
Re: Why you still here blue-boy?
And you never do anything but whine like a child about how mean ol’ Mikey hurt your fee-fees, so how about you do us all a favor and go suck on a pacifier for a few decades.
Re: You're telling me? -- You're telling ME, Google-boy?
Luckily enough for the rest of us, all your spittle ends up on your own screen and keyboard.
"Yes, lots of people are rightly concerned that videos and manifestos related to attacks may inspire copycat (or worse) attacks."
It is understandable that people in fear react in such ways, but the reality is that there is almost no evidence whatsoever that these things actually happen on any scale.
The correct response on the part of the tech companies, at least in America, is nothing. Freedom of expression means people are allowed to produce media that the majority of the populace finds distasteful.
In addition, suppressing the evidence of crime does not prevent the crime from happening again in the future, despite the continuous stream of lies from politicians in various countries, and in fact might make it harder to catch perpetrators.
I disagree with all of this. The video is just a video of a real life event. Reality is what is so bad here. The problem is reality, not videos. Ban reality.
(Oh and go f yourself!)
Censorship is wrong, no doubt about that. If you have been warned you have the option to tune out. I've seen much worse in Hollywood productions. I guess ratings will be the next thing. Christchurch is pornography, if you don't believe it check with Daniel Webster. That doesn't mean viewing it should be unlawful, just not my preference.
Is it like blaming the messenger?
To me, it's so absurd, it's more like blaming the piece of paper the message is written on!
The only difference between a live video of the event and the FUD propaganda spread by governments and mainstream media is that it's more likely to be true.