UK Parliament Takes First Step Towards Making Google & Facebook Censor Everything
from the this-is-a-bad-idea dept
Look, let's just start with the basics: there are some bad people out there. Even if the majority of people are nice and well-meaning, there are always going to be some who are not. And sometimes, those people are going to use the internet. Given that as a starting point, at the very least, you'd think we could deal with it calmly and rationally, and recognize that maybe we shouldn't blame the tools for the fact that some not very nice people happen to use them. Unfortunately, that appears to be asking a lot of our politicians these days. Instead, they (and many others) rush to assign blame for the fact that these "not nice" people exist, and rather than point the finger at the not nice people themselves, they point at... the internet services they use.
The latest example comes from the UK Parliament, which has released a report on "hate crime" that effectively blames internet companies and suggests they should be fined because not nice people use them. Seriously. From the report:
Here in the UK we have easily found repeated examples of social media companies failing to remove illegal content when asked to do so—including dangerous terrorist recruitment material, promotion of sexual abuse of children and incitement to racial hatred. The biggest companies have been repeatedly urged by Governments, police forces, community leaders and the public, to clean up their act, and to respond quickly and proactively to identify and remove illegal content. They have repeatedly failed to do so. That should not be accepted any longer. Social media is too important to everyone—to communities, individuals, the economy and public life—to continue with such a lax approach to dangerous content that can wreck lives. And the major social media companies are big enough, rich enough and clever enough to sort this problem out—as they have proved they can do in relation to advertising or copyright. It is shameful that they have failed to use the same ingenuity to protect public safety and abide by the law as they have to protect their own income.
Social media companies currently face almost no penalties for failing to remove illegal content. There are too many examples of social media companies being made aware of illegal material yet failing to remove it, or to do so in a timely way. We recommend that the Government consult on a system of escalating sanctions to include meaningful fines for social media companies which fail to remove illegal content within a strict timeframe.
This is the kind of thing that sounds good to people who (a) don't understand how these things actually work and (b) don't spend any time thinking through the consequences of such actions.
First off, it's easy for politicians and others to sit there and assume that "bad" content is obviously bad. The problem here is twofold: first, there is so much content showing up that spotting the "bad" stuff is not nearly as easy as people assume, and second, because there's so much content, it's often difficult to understand the context enough to recognize if something is truly "bad." People who think this stuff is obvious or easy are ignorant. They may be well-meaning, but they're ignorant.
So, for example, they say that these are cases where such content has been "reported," on the assumption that this means the companies must now "know" the content is bad and should remove it. The reality is much more difficult. Do they recognize how many such reports these companies receive? Do they realize that before companies start taking down content willy-nilly, they have to actually understand what's going on? Do they realize that it's not always so easy to figure out what's really happening?
Let's go through the examples given: "dangerous terrorist recruitment material." Okay, seems obvious. But how do you distinguish terrorist recruitment videos from documenting terrorist atrocities? It's not as easy as you might think. Remember how a video of a European Parliament debate on anti-torture was taken down because the system or a reviewer thought it was promoting terrorism? People think this stuff is black and white, but it's not. It's all gray. And the shades of gray are very difficult to distinguish. And the shades of gray may differ greatly from one person to another.
Sexual abuse of children. Yes, clearly horrible. Clearly things need to be done. There are already systems through which government-associated organizations and social media platforms share hashes of photos deemed to be problematic, and those photos are blocked. But, again, edge cases are tricky. Remember, it wasn't that long ago that Facebook got mocked for taking down the famed Napalm Girl photo. Here's a situation that seems black and white: no naked children. Seems reasonable. Except... this naked child is an iconic photo that demonstrates the horrors of war. That doesn't mean we should allow all pictures of naked children online -- far from it, obviously. But the point is that it's not always so black and white, and any policy proposal that assumes it is (as the UK Parliament seems to be suggesting) has no idea what a mess it will cause.
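To see why even those hash-sharing systems are harder than "just block it," here's a minimal sketch of exact-hash matching. The blocklist and the image bytes are hypothetical, and real systems (like Microsoft's PhotoDNA) use perceptual hashes rather than exact cryptographic ones, precisely because exact hashes are trivially evaded:

```python
import hashlib

def sha256(data: bytes) -> str:
    """Exact cryptographic hash of the file bytes."""
    return hashlib.sha256(data).hexdigest()

class HashBlocklist:
    """Hypothetical blocklist of known-bad file hashes."""
    def __init__(self):
        self.hashes = set()

    def add(self, data: bytes):
        self.hashes.add(sha256(data))

    def is_blocked(self, data: bytes) -> bool:
        return sha256(data) in self.hashes

blocklist = HashBlocklist()
known_bad = b"...bytes of a known-bad image..."
blocklist.add(known_bad)

# An exact copy of a known-bad file is caught:
print(blocklist.is_blocked(known_bad))            # True
# But changing even a single byte produces a new hash and evades the match:
print(blocklist.is_blocked(known_bad + b"\x00"))  # False
```

And note what even a perfect matcher can't do: a hash match only says two images look alike. It can't make the Napalm Girl judgment call about whether context makes an image newsworthy rather than abusive.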
Next on the list: "incitement to racial hatred." This would be so-called "hate speech." But, as we've noted time and time again, this kind of thinking always ends up turning into authoritarian abuse. Over and over again we see governments punish people they don't like by claiming that what they're saying is "hate speech." But, you say, "incitement to racial hatred" is clearly over the line. And, sure, I agree. But be careful about who gets to define both "incitement" and "racial hatred." You might not be so happy. Here in the US, there are people who (ridiculously, in my opinion) argue that groups like Black Lives Matter are a form of "incitement to racial hatred." Now, you might think that's crazy, but there are lots of people who disagree with you. And some of them are in power. Are you happy about handing them the tools to demand that all social media sites take down their content or face fines? Or, how do you expect Google and Facebook to instantly determine whether a video is a clip from a Hollywood movie rather than "incitement to racial hatred"? There are plenty of powerful scenes in movies that none of us would consider "polite speech," but we don't think they should be taken down as "incitement to racial hatred."
Then the report notes that "the major social media companies are big enough, rich enough and clever enough to sort this problem out." First off, that's not true. As noted above, companies make mistakes about this stuff all the time. They take down stuff that should be left up. They leave up stuff that people think they should take down. You have no idea how many times each and every day these companies have to make these decisions. Sometimes they get it right. Sometimes they don't. Punishing them for being too slow is a near guarantee that they'll take down a ton of legitimate stuff, just to avoid punishment.
Separately, who decides who's a "major social media company" that has to do this? If rules are passed saying social media companies have to block this stuff, congrats, you've just guaranteed that Facebook and Google/YouTube are the last such companies. No new entrant will be able to take on the burden/liability of censoring all content. If you try and somehow, magically, carve out "major" social media companies, how do you set those boundaries, without creating massive unintended consequences?
The report falsely claims that these companies have successfully created filters that can deal with advertising and copyright, which is laughable and, once again, ignorant. The ad filter systems on these platforms are terrible. We use Google ads for some of our ad serving, and on a near-constant basis we're weeding out terrible ads, because no filtering system catches everything and awful people are getting their ads into the system all the time. And copyright? Really? If that's the case, why are the RIAA/MPAA still whining about Google daily? These things are much harder than people think, and it's quite clear that whoever prepared this report has no clue and hasn't spoken to anyone who understands this stuff.
Social media companies currently face almost no penalties for failing to remove illegal content.
What a load of hogwash. They face tremendous "penalties" in the form of public anger. Whenever these stories come out, the companies in question talk about how much more they need to do, and how many people they're hiring to help and all that. They wouldn't be doing that if there were "no penalties." The "penalties" don't need to be legal or fines. It's much more powerful when the actual users of the services make it clear what they don't like and won't stand for. Adding an additional legal threat doesn't change or help with that. It just leads to more problems.
And that's just looking at two awful paragraphs. There's much more like that. As Alec Muffett points out, the report has some really crazy ideas, like saying that the services need to block "probably illegal content" that has "similar names" to illegal content:
Despite us consistently reporting the presence of videos promoting National Action, a proscribed far-right group, examples of this material can still be found simply by searching for the name of that organisation. So too can similar videos with different names. As well as probably being illegal, we regard it as completely irresponsible and indefensible.
So, not only do the authors of this report want Google to remove any video that is reported, no questions asked (despite a long history of such systems being widely abused), they also want it to magically find all "similar" content that is "probably illegal," even content with "different names." Do they have any idea what they're asking for? And immediately after that they, again, insist that this must be possible because of copyright filters. Of course, these would be the same copyright filters that tried to take down Cory Doctorow's book Homeland because it had a "similar name" to the Fox TV show "Homeland." "Similar names" is a horrific way to build a censorship system. It will not work.
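A toy version makes the failure mode concrete. This sketch assumes a hypothetical blocklist and an arbitrary similarity threshold, using simple string matching from Python's standard library:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Crude string similarity in [0, 1], case-insensitive."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Hypothetical blocklist: a rightsholder has flagged the TV show "Homeland".
blocked_titles = ["Homeland"]

def naive_similar_name_filter(upload_title: str, threshold: float = 0.8) -> bool:
    """Block anything whose name is 'similar' to a blocked title."""
    return any(similarity(upload_title, t) >= threshold for t in blocked_titles)

# Cory Doctorow's novel shares the show's name exactly, so it gets flagged:
print(naive_similar_name_filter("Homeland"))                    # True (false positive)
# Meanwhile, a trivially padded title sails right past the same filter:
print(naive_similar_name_filter("Homeland (full show) part 1")) # False (false negative)
```

Tighten the threshold and infringers evade it with trivial renaming; loosen it and ever more legitimate content with coincidentally similar names gets swept up. There is no setting at which name similarity distinguishes legal from illegal content.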
What's so frustrating about this kind of nonsense is that it keeps popping up again and again, often from people with real power, in large part because they simply do not comprehend the actual result of what they're saying or the nature of the actual problem. There are not nice people doing not nice things online. We can all agree (hopefully) that we don't like these not nice people and especially don't like the not nice things they do online. But to assume that the answer to that is to blame the platforms they use for not censoring them fast enough misses the point completely. It will create tremendous collateral damage for tons of people, often including the most vulnerable, while doing absolutely nothing to deal with the not nice people and the not nice things they are doing.
Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.
Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.
While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.
–The Techdirt Team
Filed Under: censorship, europe, hate crime, hate speech, intermediary liability, parliament, platforms, uk
Companies: facebook, google, twitter, youtube
Reader Comments
The First Word
“Humans learn by pain. Expect such idiocy to be implemented to some degree and politicians be left wondering what to do when it fails.
made the First Word by OldMugwump
Humans learn by pain. Expect such idiocy to be implemented to some degree and politicians be left wondering what to do when it fails.
Re:
The similarity rule would cause the British government to censor everything their government does.
Re:
But your second paragraph, humans learn by pain? What? Are you serious? That is very dumb and careless to say. You are basically doing the brainwashing work for tyrants.
In your world, you will have to kick the crap out of your kid so he/she learns, right? What a stupid comment.
Sure, in your world there is no such thing as actually CONVINCING people and TEACHING people by means of education, information, data, understanding, empathy.
Sure, that is why innocent people had to be nuclear bombed in Hiroshima and Nagasaki right, so they learn a lesson!
Speak for yourself, but don't project your stupidity on others.
Sure, the first man on the moon got there by being whiplashed and punished all day. Nothing to do with training, data, technology, etc. You are a fool. And a fool with draconian views.
A bit off topic but I have a question.
Re: A bit off topic but I have a question.
Your site could be censored if you publish hate speech such as a political opinion that does not favor the right people, or facts contrary to the interests of politicians or their friends.
Re: Re: A bit off topic but I have a question.
do you have any proof
Google and Facebook ARE major "social media" as everyone but Masnicks know.
The key fact is that those companies refuse to take down content EVEN WHEN INFORMED. What's clearly outside common law can't be allowed to operate openly. We're NOT better off when, say, drug dealers and prostitutes advertise openly. We're not better off because murderers and rapists now have a worldwide "platform". That's just silly libertarianism, also known as childish nihilism. You're past 40 now. Grow up.
You manifestly want corporate giants to operate without any responsibility to the public. -- But you don't object over their chosen target of censorship, such as withdrawing all advertising from Infowars. Clearly, there's a category of speech that you won't defend.
Your schtick of wailing about slippery slope imminent is a constant among soi-disant elites for at least a century now, and yet society clearly prospers when SOME measures are taken against simply "going too far". If you won't stop where we can clearly see the abyss, when will you?
Re: Google and Facebook ARE major "social media" as everyone but Masnicks know.
Source?
Re: Google and Facebook ARE major "social media" as everyone but Masnicks know.
I must not be reading the same publications that you do. Many requests are to remove damaging info about the person making the request, be it a direct request or via a third party business. Also many requests are for the removal of infringing art (photos, painting, music, books) and the requester thinks they have the copyright ... or at least they are supposed to - many do not but make the request anyway. How do these requests against drug dealers, prostitutes, murderers and rapists rank compared to the two above categories?
Re: Google and Facebook ARE major "social media" as everyone but Masnicks know.
And your proof of this is... what, exactly?
> We're not better off because murderers and rapists how have a worldwide "platform".
And your proof of this is... what, exactly?
> If you won't stop where we can clearly see the abyss
And your proof that there is an abyss, that "we can clearly see", is... what, exactly?
Re: Google and Facebook ARE major "social media" as everyone but Masnicks know.
Citation please.
"We're NOT better off when, say, drug dealers and prostitutes advertise openly."
I disagree. If someone is advertising an illegal service openly, then the open advertisement is clearly visible to the police, and they can investigate and prosecute far more easily than when those services are hidden. Stopping the open advertising would make police investigation more difficult; it would not stop the illegal activity.
"You manifestly want corporate giants to operate without any responsibility to the public."
Only if you lie about what's being written.
"But you don't object over their chosen target of censorship, such as withdrawing all advertising from Infowars"
That's not censorship. The advertisers have as much ability to exercise free speech as Infowars does to spread their bullshit, and that includes deciding not to advertise on that site. Infowars are free to fund their site in ways other than 3rd party advertising, the advertisers cannot be compelled to advertise on their site.
Re: Re: Google and Facebook ARE major "social media" as everyone but Masnicks know.
if not go ask Mike Masnick and all those Techdirt articles about the abuses done to the market (high prices, bad service, monopoly, etc) by Comcast and the pack.
So, government sucks balls, but corporations do too. And more often than we know, they collude, for profit and money, against people and their rights.
And that is precisely the problem. If you think corporations are any better than any government, you are blind.
By the way, appears that Masnick so believes in free speech that he's again pre-approving all comments.
(This is my second comment topic, after two just prior attempts.)
Re: By the way, appears that Masnick so believes in free speech that he's again pre-approving all comments.
out_of_the_blue hates it when due process is enforced. And logic.
Blame the tools.
I remember...
Re: Blame the tools.
Are you sure that information doesn't have the right to be forgotten in the UK?
Re: Blame the tools.
Yours truly,
Your USA government and junior aka UK government
"For your safety"
I wish they would start using the appropriate reasons for why they want something. For example: this needs to be changed because cows can't see into the earth's core, my fingers aren't circles and those letters are arranged in such a way that when I change them to numbers it still does not make any sense.
Re:
Some idiots will never learn, or they simply don't want to, or they simply play the fool... for a buck.
Shifting the blame...
Re: Shifting the blame...
Poorly educated people might not articulate their ideas in a way that moves people to action.
Re: Re: Shifting the blame...
wealth = education
power = education
money = education
information = education
prestigious school = education
sweet speaking = education
Re: Re: Shifting the blame...
People with more information create more censorship. FTFY
Education has nothing to do with schools, information, or anything else. EDUCATION comes from home, from family values, personal values, kindness to others, etc. Not prestigious schools, ivy league bullshit and all that crap; that would be just information or prestige, but NOT education.
Schools DON'T educate people. Families educate people.
Re:
They will try, they can try...but they will NEVER be able to fully control the internet, never.
They should take them to court
Kind of like saying "We have put a man on the moon so why don't we put a man on Pluto?"
Sure, if you don't know much about space this seems perfectly reasonable. We built a rocket and a lander and sent people to the moon; Pluto is just another rock out there, so it shouldn't be that different. Then once you talk to someone at NASA, you quickly find out it is a VERY different problem.
Re:
"Its all terrorism"
"All muslim are evil"....."Islam is evil"
"Drugs are bad"
"Restore peace"
"All immigrants are bad"
etc etc etc
Its a non issue for speech
Some look at this as speech. Except the speech isn't generated by these companies; it's externally generated under rules these companies created for their userbase, rules they have been ineffective at monitoring or enforcing per their own terms of service.
Some look at this as influence peddling (election meddling), i.e. Facebook or Google not censoring known false articles or criminal activity conducted via their tools. This again is on the shoulders of Facebook and Google, whose monitoring and blocking of content that violates their own terms of service has been ineffective.
Some look at this as individuals rights to speech including forms of art or commentary that a majority may find inappropriate content; While some get off on seeing the purge played out live using the tools Facebook and Google provide.
Facebook and Google could shut down tomorrow and no one's speech will stop. The methods for that speech will change, but speech isn't defined by the tools, e.g. an artist using a paintbrush doesn't stop being an artist because they switched to felt-tipped markers. The form may change, but the art, the speech, remains.
Look, Google and Facebook break the rules, change their terms of service to benefit shareholders, and have done so with little regard for their impact on individuals. Individuals are the product, and as long as people keep using these tools they will have little say in how the tools use them.
Governments, in my opinion, are doing their jobs and struggling to come up with solutions to problems these companies are creating and enabling, while ignoring them to further generate click revenue.
Governments don't always get it right (laugh), or take a long time to get it right (laugh), and in this case could be overreacting, but their attempts ARE putting pressure on these companies to take more ownership of the tools they provide to individuals and businesses, and of the content stored on those tools. That's called regulation, a function of governments around the world.
I for one am not worried about infringements of speech. Speech will continue in all its forms, perhaps with less reliance or dependence on either of these companies, which in my view would be a good thing for speech... by that I mean less clickbait, less forwarding of someone else's viewpoints, and perhaps more real-life in-person conversations, which are much healthier for our species.
I refuse to fight on behalf of these multi-billion-dollar entities just because they have cool tools. Their lobbies have already affected legislation in multiple countries to benefit them; these aren't charities or benevolent citizens. Paraphrasing Warren Buffett, these are money-generating machines.
Their management created these problems by ignoring reality that making money isn't the only responsibility these mega corporations have. Time for them to face the fire and grow as entities or die out one country at a time.
Re: Its a non issue for speech
Then kindly STFU.
thanx
Re: Its a non issue for speech
On the other hand, I disagree with you: governments are not doing their job, they are just hungry for more power. They are, as the article says, criminalizing the tool and not the INDIVIDUALS that make these nefarious "expressions".
You contradict yourself: Google and Facebook are not enabling anything, because, as you said, the art/the speech was already there, be it hate speech, CP, or whatever.
So all in all, corporations don't care, they are all about money; governments don't care, they are all about power and control (with little understanding of technology). The solution will NEVER come from government nor from corporations. The only real solution is organized civil society: non-profit, non-governmental organizations whose primary goal is human rights, civil liberties, etc., not elections, not money.
This is so ________
I cant say anymore, its just ________.
Black and White
https://www.theguardian.com/technology/2015/dec/08/googles-eric-schmidt-spell-checkers-hate-harassment-terrorism
https://www.nytimes.com/2016/11/22/technology/facebook-censorship-tool-china.html
- a generation of breast-feeding mothers
- several generations of breast cancer victims and survivors
- their families
- their friends
...not to mention anybody cooking fowl and trying to avoid the dark meat.
Porn producers LOVE to use punny names based on popular media or literature. I'm dying to see how the "similar names" rule works with that.
ONE MORE ANALOGY
In particular, the USA has failed to remove crime from its streets, police killing innocent people, racism, elitism, corruption, etc. etc. etc.
In the UK we just need to talk about those horrendous shoeboxes where they force people to live. That is just a crime.