Content Moderation At Scale Is Impossible: The Case Of YouTube And 'Hacking' Videos
from the how-do-you-deal-with-this? dept
Last week there was a bit of an uproar about YouTube supposedly implementing a "new" policy that banned "hacking" videos on its platform. It came to light when Kody Kinzie from Hacker Interchange tweeted about YouTube blocking an educational video he had made about launching fireworks via WiFi:
We made a video about launching fireworks over Wi-Fi for the 4th of July only to find out @YouTube gave us a strike because we teach about hacking, so we can't upload it.
YouTube now bans: "Instructional hacking and phishing: Showing users how to bypass secure computer systems"
— Kody (@KodyKinzie) July 2, 2019
Kinzie noted that YouTube's rules on "Harmful or dangerous content" now listed the following as an example of what kind of content not to post:
Instructional hacking and phishing: Showing users how to bypass secure computer systems or steal user credentials and personal data.
This resulted in some quite reasonable anger at what appeared to be a pretty dumb policy. Marcus "MalwareTech" Hutchins posted a detailed blog post on this change and why it was problematic, noting that it simply reinforces the misleading idea that all hacking is bad.
Computer science/security professor J. Alex Halderman chimed in as well, to highlight how important it is for security experts to learn how attackers think and function:
I've taught college-level computer security at @UMich for 10 years, and the most important thing we teach our students is how attackers operate. YouTube's new policy will do nothing to stop bad guys, but it will definitely make it harder for the public to learn about security. https://t.co/1wvB63c5aB
— J. Alex Halderman (@jhalderm) July 3, 2019
Of course, some noted that while this change to YouTube's description of "dangerous content" appeared to date back to April, there were complaints about YouTube targeting "hacking" videos last year as well.
Eventually, YouTube responded to all of this and noted a few things: First, and most importantly, the removal of Kinzie's videos was a mistake and the videos have been restored. Second, that this wasn't a "new" policy, but rather just the company adding some "examples" to its existing policy.
This raises a few different points. While some will say that this was just another moderation mistake and therefore a non-story, it is still an important illustration of the impossibility of content moderation at scale. You can certainly understand why someone might decide that videos explaining how to "bypass secure computer systems or steal user credentials and personal data" would be bad and potentially dangerous -- and you can understand the thinking that says "ban it." And, on top of that, you can see how a less sophisticated reviewer might not carefully distinguish between "bypassing secure computer systems" and a fun hacking project like "launching fireworks over WiFi."
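To put some rough numbers on the scale problem, here's a quick back-of-envelope sketch in Python. Every figure in it is an assumption chosen purely for illustration -- the upload volume, review speed, and reviewer accuracy are not actual YouTube statistics -- but even generous guesses show why "mostly accurate" moderation still produces a steady stream of bad calls:

# Back-of-envelope look at moderation volume. Every figure below is an
# assumption picked for illustration, not an actual YouTube number.

HOURS_UPLOADED_PER_MINUTE = 500            # assumed upload rate
WALL_CLOCK_MINUTES_PER_DAY = 24 * 60

footage_minutes_per_day = HOURS_UPLOADED_PER_MINUTE * 60 * WALL_CLOCK_MINUTES_PER_DAY
# = 43,200,000 minutes of new footage every day under this assumption

SECONDS_OF_REVIEW_PER_FOOTAGE_MINUTE = 6   # assume only a very cursory skim
REVIEWER_SECONDS_PER_SHIFT = 8 * 60 * 60   # one eight-hour shift

reviewers_needed = (footage_minutes_per_day * SECONDS_OF_REVIEW_PER_FOOTAGE_MINUTE
                    / REVIEWER_SECONDS_PER_SHIFT)
print(f"Reviewers needed for even a cursory skim: {reviewers_needed:,.0f}")

# Even a very accurate process produces a flood of wrong calls.
DECISIONS_PER_DAY = 1_000_000              # assumed number of moderation decisions
ACCURACY = 0.99                            # assume reviewers get it right 99% of the time
print(f"Mistaken decisions per day at 99% accuracy: {DECISIONS_PER_DAY * (1 - ACCURACY):,.0f}")

Under those made-up assumptions you'd need on the order of nine thousand full-time reviewers just to glance at everything, and a 99%-accurate process would still make roughly ten thousand wrong calls every single day -- which is why "it was a mistake, and it was fixed" stories like this one are inevitable rather than exceptional.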
But it also demonstrates that there are different needs for different users -- and having a single, centralized organization making all the decisions about what's "good" and what's "bad" is inherently a problem. Going back to Hutchins' and Halderman's points above, even if the Kinzie video was taken down by mistake, and even if the policy is really supposed to be focused on nefarious hacking techniques, there is still value in security researchers and security professionals being able to keep on top of what more nefarious hackers are up to.
This is not all that different from the debate over "terrorist content" online -- where many are demanding that it be taken down immediately. And, conceptually, you can understand why. But when we look at the actual impact of that decision, we find that removing such content appears to make it harder to stop actual terrorist activity, because that activity becomes harder to track.
There is no easy solution here. Some people seem to think that there must be some magic wand that can be waved that says, "leave up the bad content for good people with good intentions to use to stop that bad behavior, but block it from the bad people who want to do bad things." But... that's not really possible. Yet, if we're increasingly demanding that these centralized platforms rid the world of "bad" content, at the very least we owe it to ourselves to look to see if that set of decisions has some negative consequences -- perhaps even worse than just letting that content stay up.
Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.
Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.
While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.
–The Techdirt Team
Filed Under: content moderation, content moderation at scale, hacking, hacking videos
Companies: youtube
Reader Comments
Wow, out of all of the companies that could possibly fail to understand Kerckhoffs's Principle, you really wouldn't think Google would be one of them!
For those who aren't familiar with computer security, Kerckhoffs's Principle is one of the most counterintuitive, yet most important, fundamental principles of the entire field: "the enemy knows the system." It means that no discussion of information security is valid unless it begins with the ground-level assumption that the bad guys already know every detail of how your system works, and therefore, if you aren't secure even with that knowledge being out there, you aren't secure, period.
So what does that mean in this context? It means the bad guys already know about hacking -- and probably not from YouTube. Taking hacking information off of YouTube isn't going to shut down any attacks. What it will do, as Professor Halderman pointed out, is exactly the same thing that people using secrecy and obscurity in defiance of Kerckhoffs's Principle always accomplish: it makes it more difficult for the good guys to level the playing field.
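Here's a minimal Python sketch of that contrast (the scheme and values are made up for illustration, not a real authentication system): one design whose only protection is that nobody has read the code, and one keyed design that stays sound even when the attacker has the full source:

# Toy illustration of Kerckhoffs's Principle: assume the attacker has this
# entire file. The "obscure" scheme offers nothing once the code is read;
# the keyed HMAC stays sound because the only secret is the key.
import hmac
import hashlib
import secrets

def obscure_tag(message: bytes) -> bytes:
    """'Security' that relies entirely on nobody reading this function."""
    SECRET_TWEAK = 0x2A                    # the whole scheme is this one hidden byte
    return bytes(b ^ SECRET_TWEAK for b in message)

def keyed_tag(message: bytes, key: bytes) -> bytes:
    """Security that survives full disclosure of the design: only the key is secret."""
    return hmac.new(key, message, hashlib.sha256).digest()

msg = b"some message"
key = secrets.token_bytes(32)              # per-deployment secret key

# An attacker who has read this file reproduces the "obscure" tag at will...
print("obscure tag forged:", obscure_tag(msg) == obscure_tag(msg))

# ...but cannot reproduce the keyed tag without the key, even knowing the algorithm.
attacker_guess = keyed_tag(msg, secrets.token_bytes(32))
print("keyed tag forged:", hmac.compare_digest(attacker_guess, keyed_tag(msg, key)))

The point isn't the toy code; it's that the second design loses nothing when the whole world can read it, which is exactly the standard Kerckhoffs asks us to design to.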
Cliff Stoll made the same basic point in his classic book The Cuckoo's Egg, which described in detail the techniques that a hacker used to break into his computer network and several others: he didn't feel any qualms about publishing this information because the "people in the black hats" already know this stuff, and teaching everyone else about it allows them to be better informed and better able to defend against such attacks.
Frankly, in light of Kerckhoffs's Principle alone this decision doesn't add up, but it looks even worse when you consider research that suggests that approximately 89% of people are basically honest. Any intelligent admin who knows that there are somewhere in the neighborhood of 8 good guys for every bad guy would want to do everything possible to recruit and empower them, rather than keep them in the dark!
[ link to this | view in chronology ]
"bypassing secure computer systems"
Doesn't DeCSS do this?
Does this mean any video mentioning it should go?
Does this mean any videos from his other companies should go?
[ link to this | view in chronology ]
Re:
That is what the premise was years ago.
[ link to this | view in chronology ]
Re:
"Doesn't DeCSS do this?"
Every type of DRM does exactly that, yes. There are plenty of good reasons why "hacking" DeCSS more closely resembled building an antivirus template.
[ link to this | view in chronology ]
If impossible, cut them down to size. No right to exist at all.
Since by your notions, mere users don't have any right to use the platform (during good behavior by common law terms), then you can't argue that you're for The Public. So X that basis out...
Then you're just as always arguing for corporate profits with ZERO responsibility.
Again round on this! Can't you come up with any topic NOT blatantly pro-corporation propaganda?
NO, because you're paid by Silicon Valley capital and Evil Central to spew this view:
https://copia.is/wp-content/uploads/2015/06/sponsors.png
[ link to this | view in chronology ]
Re: If impossible, cut them down to size. No right to exist at a
Copyright is anti-user and anti-creator.
[ link to this | view in chronology ]
Let’s discuss corporations using copyright to censor the speech of third parties. How’s that for anti-corporation propaganda, hmm?
[ link to this | view in chronology ]
Re:
There are lots of things that are impossible. That doesn't mean we should ban those things from existence. Human flight was once impossible, do you want to ban that too?
Rights vs. privileges, learn the difference. No, no one has the "right" to use platforms, but they do have the "privilege". However, like a young child with a toy, that privilege can be taken away if they misbehave. Rights cannot.
The public has a right to free speech, free from government oppression. They do NOT have a right to use any private platform they want to while violating the rules established by said platform.
And you still don't understand freedom of speech does not mean guaranteed access to online platforms.
I completely agree.
Can't you come up with something other than sheer idiocy and conspiracy theories?
No he's not and you continuing to link to that image doesn't make it true, no matter how much you want it to.
[ link to this | view in chronology ]
Re: If impossible, cut them down to size. No right to exist at a
(during good behavior by common law terms)
Couldn't pass that one up - What law are they violating again? Please cite!
And to be clear - are you saying:
"Google is breaking the law"
or
"Corporations - ANY corporations - should be banned under the premise of Cabbage Law."
Go on - I really want to hear this one!
[ link to this | view in chronology ]
Irony, I guess Google Project Zero won't post to YouTube
Seriously, Google's Project Zero disclosure policy has caused quite a few headaches by releasing information before patches are available. Some disclosures I agree need to be released, some should be delayed a bit so a patch can be QC'd before release, and their own extension request policy has been the source of most of the issues. If you are not familiar: https://googleprojectzero.blogspot.com/
Now I guess if someone else releases a disclosure on Google's platforms they will be banned, but not if you are in their own security department?
[ link to this | view in chronology ]
"Instructional hacking and phishing: Showing users how to bypass secure computer systems"
How secure is it when you can so easily bypass it? They want you to think it is secure, but it is not.
[ link to this | view in chronology ]
You gotta hand it to Alphabet/Google/YouTube, they're just reading from the same page in the same book as the SESTA/FOSTA protagonists (gubbermint) - "if you can see it, you can make it go away by making sure that no one else can see it". We're all familiar with how that's working out, right?
Similar to that are the gun control activists (my inner warrior wants to use a much more derisive adjective here) who believe that if you can't buy a gun, then you can't commit a crime. I'm not even gonna ask for a break here, I'm just gonna go find some cleaner air space that doesn't contain so much wasted oxygen.
But there is one saying in the pro-gun community that should be co-opted into the computer security field, and that is: "An armed society is a polite society." I'd express it thus: "A knowledgeable computer owner owns a safe computer." That goes for everyone from individuals all the way up to the top of the ladder. And it's the very bottom-most underpinning of my personal computing philosophy: I practice safe hex. No one else can do that for me, it's my fault if I get taken down/out, and no one else's.
Ya know, after a few moments in review (before hitting Submit), I've come to realize something.... why is it that we can (have to) have all kinds of oversight, which I read as Big Brother-ism, and yet when things go wrong, we can't sue those entities, public or private, that "promised" we'd be safe if we just follow their instructions? Seems like a lop-sided way of doing things, eh?
sumgai
[ link to this | view in chronology ]
Re:
"An armed society is a polite society."
If that were true the gun-related body count would be a hell of a lot lower.
You can't just stroll into Tesco and buy a gun over here in the UK and we are generally a polite society.
US gun deaths this year alone: https://www.gunviolencearchive.org/reports/number-of-gun-deaths
UK gun deaths 2017-2018: https://www.ons.gov.uk/peoplepopulationandcommunity/crimeandjustice/bulletins/crimeinenglandandwales/yearendingdecember2018#offences-involving-knives-or-sharp-instruments-are-still-rising-while-firearms-offences-decrease
Even allowing for population differences access to firearms is the issue. It's much easier to defend yourself against a knife than a gun:
https://en.wikipedia.org/wiki/Wolverhampton_machete_attack
[ link to this | view in chronology ]
Re: Re:
"It's much easier to defend yourself against a knife than a gun"
Especially when you have the gun
Ask any LEO
[ link to this | view in chronology ]
Re: Re: Re:
Ask any nursery worker. And if you need guns to make society polite, there's something very badly wrong in your society.
[ link to this | view in chronology ]
Re: Re: Re: Re:
Does your Country have a military force?
Instead of telling me how I should live, how about I tell you how you should live?
[ link to this | view in chronology ]
Re: Re: Re:
Ask anyone who was at a certain concert in Las Vegas if they'd rather have had the guy throwing knives from that window instead, or if it would have been easier if the crowd were armed with more guns.
[ link to this | view in chronology ]
Re: Re: Re: Re:
If the crowd were armed with guns and he was throwing knives at them, there would have been a different outcome, yes.
It is easier to defend yourself with a gun than a knife
You one of those guys who takes a knife to a gun fight? I guess we'll know the first time you try it.
[ link to this | view in chronology ]
Re: Re:
"If that were true the gun-related body count would be a hell of a lot lower. You can't just stroll into Tesco and buy a gun over here in the UK and we are generally a polite society."
I keep saying both sides of the gun vs gun control debate are a bit wrong in their basic assumptions...
Switzerland and Sweden both have far more heavy-duty guns (assault rifles and hunting rifles respectively) than the US does, per capita. Yet we rank very low on gun-related murder.
Mexico City has some of the most draconian gun control laws in the world, and Washington D.C. has the most rigorous gun control law in the US. In both cases these places stand out when it comes to gun-related murder.
I think you'll find that the best correlation to murder isn't the prevalence of guns, but the state of mental health in society. In the UK it's difficult to be born into hopelessness. In the US, if you're born in, say, Flint, odds are good you were literally born to lose. With much of the population in a fortress mentality towards some other part of the citizenry, the gun simply becomes a more convenient killing/defense tool.
Add to that the mythology in the US of the gun being the "great equalizer" and other, similar catchphrases and you end up with large parts of the citizenry being not only aggrieved but convinced that holding a Glock will at least ensure they get a modicum of respect.
The entire debate of gun control vs the right to bear arms is irrelevant when the background situation is that of a low-intensity war.
[ link to this | view in chronology ]
Wait a second...
Just last week I was told that YouTube can host or not host whatever it does or does not want to, and I had to just shut up and take it. Complaining was forbidden! If I didn't like it, I could take the half a trillion dollars I have lying around and go build my own video hosting platform.
Now there's a whole article on Techdirt bemoaning the fact that YouTube is doing what so many last week said they have every right to do.
So which is it?
[ link to this | view in chronology ]
Re: Wait a second...
The point is there is a difference between complaining and saying that they should be forced to host it.
It is like saying that a publisher refusing to print the letter "I" shouldn't be illegal, but that it's a fucking stupid policy.
[ link to this | view in chronology ]
Re: Re: Wait a second...
There was no problem hosting Crowder until Maza whined.
Their Company. Their way. The end.
John Snape, Techdirt wasn't saying that; your comment needs to be directed to the peanut gallery, since they are the culprits you seek.
Techdirt is saying what it says in this article
"...at the very least we owe it to ourselves to look to see if that set of decisions has some negative consequences -- perhaps even worse than just letting that content stay up."
Same as it did in the post about backpage very recently
Techdirt has always said this, to the best of my knowledge
Lets go Trigger. The bigots are cumming down the slope
[ link to this | view in chronology ]
Re: Wait a second...
World of difference between 'Shouldn't' and 'Shouldn't be allowed to'.
There's no conflict between arguing against the idea of forcing private platforms to host content they don't want to and also pointing out when they make a bad call.
[ link to this | view in chronology ]
Re: Wait a second...
Agreed. If you accept the premise that Youtube has full control over the content it allows on its platform, there is no such thing as a "bad call". They can ban whatever the hell they want.
If you don't accept that premise, then you are accepting that there has to be some check on Youtube's ability to police its own site. And if you accept that premise, it's better to have a formal process than to just rely on social pressure from influential users to dictate what is "proper".
[ link to this | view in chronology ]
Re: Re: Wait a second...
Edit: If Youtube had an appeal process that actually worked for the average user, that could suffice. But that's impossible, because it just kicks the "content moderation at scale" can down the road: eventually the bad actors will just start appealing everything and overwhelm the appeals process.
[ link to this | view in chronology ]
Re: Re: Re:
And trying to moderate all the content on their site with 100% accuracy is no different, it's completely overwhelming. Hence the point of this article.
So tell me, why is moderating billions of users who make multiple posts a day somehow magically possible, but an appeals process for far fewer DMCA requests is completely overwhelmed and impossible? Hm?
[ link to this | view in chronology ]
Counterpoint: Content takedowns based on false/invalid DMCA notices.
[ link to this | view in chronology ]
Re:
So what? They took it down, they find it easier than trying to determine the truth.
Their company. The end
You are free to use another site or build your own
[ link to this | view in chronology ]
Re: Re:
" So what? They took it down, they find it easier than trying to determine the truth. Their company. The end"
Yeah, but the problem with the specific situation you responded to is that the legal background renders youtube liable if any of the takedowns they don't respond to turns out to be real.
So they are basically intimidated into playing by rules their own company wouldn't necessarily sanction.
[ link to this | view in chronology ]
Re: Re: Re:
"So they are basically intimidated into playing by rules their own company wouldn't necessarily sanction."
Such as:
https://www.techdirt.com/articles/20190616/02114442409/we-should-probably-stop-blaming-technology-failings-human-beings.shtml#c522
[ link to this | view in chronology ]
'Can do' does not always mean 'smart to do so'
Agreed. If you accept the premise that Youtube has full control over the content it allows on its platform, there is no such thing as a "bad call". they can ban whatever the hell they want.
Just because someone has the ability to smash their hand with a hammer does not mean it wouldn't be a 'bad call' for them to do so.
[ link to this | view in chronology ]
Re: Re:
That's actually not how that works. Chess players are well within their rights to play however they want. That doesn't change the fact that deliberately putting your king into checkmate is a bad call. Same applies here: they are within their rights to do it, but that doesn't mean it isn't a bad call and people can't criticize them for it. I mean come on, you're criticizing them for booting off Nazis, the scum of the earth, so why is it suddenly hypocritical for someone else to do the same thing you are?
No, that's not how it works. Just because they have the right to police their site how they want doesn't mean that everyone has to agree with it. You are saying that everyone's speech (online or offline) should be policed just because you don't particularly agree with them.
No, it's really not. And the reason why is because that formal process by definition is a violation of the First Amendment. As soon as the government starts dictating what speech is or is not allowed, it runs flat into the First Amendment. Why is it so hard for you and your ilk to understand this despite being told innumerable times?
[ link to this | view in chronology ]
Re: Re: Re:
The attempts at framing are transparent and silly, no?
[ link to this | view in chronology ]
Re: Re: Wait a second...
"If you accept the premise that Youtube has full control over the content it allows on its platform, there is no such thing as a "bad call". they can ban whatever the hell they want."
That is completely false. The point of the article is that some of YouTube's actions are arguably bad for society as a whole, such as making it harder to educate the public on how to protect themselves against hacking. Just because they're entitled to do something doesn't mean they can't be legitimately criticized for it. That makes the rest of your comment equally false.
[ link to this | view in chronology ]
Re: Re: Wait a second...
"If you don't accept that premise, then you are accepting that there has to be some check on Youtube's ability to police its own site"
No, it just means that one private group can exercise its freedom of speech to criticise the way another private group chooses to exercise theirs.
"it's better to have a formal process "
Yes, which is why people are criticising the messy and opaque process YouTube currently have in place. That doesn't mean people accept that the only alternative is for the government to come in and prevent them from moderating their platform. Stop with that false premise.
[ link to this | view in chronology ]
Re: Wait a second...
Still "YouTube can host or not host whatever it does or does not want to".
It sucks when they choose to block something you or I don't think they should block but it's their platform and they can host or not host at their whim.
[ link to this | view in chronology ]
Stop making strawman arguments and false equivalences that you know are bullshit. You look like a fool in doing so.
[ link to this | view in chronology ]
Re: Wait a second...
Just last week I was told that YouTube can host or not host whatever it does or does not want to, and I had to just shut up and take it.
Your reading comprehension isn't very good is it?
At no point did we tell you to "shut up and take it." We explained why your plan to make use of the law to require companies to host content they did not wish to associate with was a real problem.
This post, on the other hand, is highlighting a specific problem with the nature of moderating content.
These are not mutually exclusive views.
Complaining was forbidden!
No one said that. What we said was bringing in the state to force companies to host content they felt was problematic was, itself, a huge problem. That's not about complaining. Complain away. Just don't bring the government in to force companies to host content they find offensive.
Now there's a whole article on Techdirt bemoaning the fact that YouTube is doing what so many last week said they have every right to do.
Yes, they have every right to do it. We also can point out why it's a bad idea. But that's different from what you said last week, which was that you wanted a law that REQUIRED them to host all content.
Can you really not tell the difference?
So which is it?
One of us is creating a strawman. The other is having an adult conversation.
Figure out which is which.
[ link to this | view in chronology ]
Re:
Well if you could stop misrepresenting and lying about what was said you might have a point. Since you can't, you don't.
No one said any of what you claimed they did. What you want is to have the government FORCE companies to do something. We aren't trying to force them to do anything, but we are pointing out that this particular policy is a fairly bad idea, though they are still within their rights to disagree with us and move forward with it anyway.
Also, note that the article is about how it's impossible to do what you want it to do and as an example brings up this fact where the hacking videos were removed BY MISTAKE. So you're taking an example that unequivocally proves you wrong and arguing that somehow we're being hypocrites. Right.
[ link to this | view in chronology ]
Re: Re:
apparently we are also hypocrites for arguing against his "position". shit is apparently FORBIDDEN if you don't agree with it, or even more outrageously, have a counterargument. freeze peaches, you know.
[ link to this | view in chronology ]
Re: Wait a second...
Both. YouTube should not be forced by government agents to host certain content, but if they privately choose to block content in ways that we believe are wrong, we can still criticise them for making the wrong decision.
Life is far easier to deal with if you stop trying to apply false dichotomies to everything and learn to deal with shades of grey.
[ link to this | view in chronology ]
Regardless of how it's implemented, the platform owner will always be a single, centralized organization making all the final decisions about what's "good" and what's "bad".
I suppose you could argue that a user could "go find another platform", so it's not really centralized. But again, that argument applies for extremist videos as much as it applies for educational hacking.
[ link to this | view in chronology ]
Re:
And that's your first mistake. They aren't making final decisions about what's "good" or "bad". They are making decisions about what they do and do not want to allow on their platforms. A Star Wars fan site has every right to ban all fan discussions about Star Trek. This is no different, just on a larger, more inclusive scale.
And that's your second mistake. Facebook, Google, Twitter, etc... are not the internet. The internet by design is decentralized. It's impossible for ANY single company to control everything that goes on online. So yes, a user can just go find another platform, or they can make their own. You can create a free blog in 10 minutes, or with maybe a day or two's worth of work, you can stand up your own Mastodon instance and create your own social media platform.
Yes it does, but nothing says we can't criticize Youtube for a decision we don't agree with. You want to take that decision away from them.
[ link to this | view in chronology ]
Re:
"But again, that argument applies for extremist videos as much as it applies for educational hacking."
It does. Then, the people who don't want to host extremist videos are free to refuse them, while authorities have a nice place to go and see who is saying what without monitoring thousands of hours of non-extremist content every day to find the potential terrorists since they've already self-identified. Win/win.
[ link to this | view in chronology ]
YouTube's new policy will do nothing to stop bad guys, but it will definitely make it harder for the public to learn about security.
Plenty of recent decisions - and plenty of less recent decisions - are aimed at making the public complacent and ignorant about security.
How else are the NSA going to get their backdoors without the public throwing a well-deserved shitfit?
[ link to this | view in chronology ]
Re:
sure, that must be the reason for youtube's (or whomever's) moderation choices.
[ link to this | view in chronology ]
YouTube makes lots of "mistakes" now and they are fooling no one
There's so much data showing the reality distortion YouTube does in the USA for the big media outlets that aren't Fox. A CNN story that nobody views and might get 500 likes will trend when PewDiePie doesn't. The major media outlets that aren't Fox (Fox never trends on YouTube now) control YouTube now for Google.
[ link to this | view in chronology ]
Re: YouTube makes lots of "mistakes" now and they are fooling no
"A CNN story that nobody views and might get 500 likes will trend when PewDiePie doesn't"
If you think the latter is a news organisation, you have bigger problems than anything YouTube can handle.
[ link to this | view in chronology ]
Hi! You made an absolutist statement wherein one example can prove you wrong. Here’s that one example. (It currently sits at #40 on Trending, but still, it’s there.)
By all means, share it.
[ link to this | view in chronology ]
Re:
And since there's so much data I'm sure you can provide some links to it and aren't once again lying. RIGHT?
Videos don't trend if they aren't viewed. That's kind of the definition of "trending": people are watching them. So please, do explain how a video that "nobody views" gets enough views to start trending. I'll wait.
You want to explain that a bit better? How exactly do they control them? Hm? Have they been given server admin access? Direct access to the website code? Please, do explain EXACTLY how they control Youtube.
As Stephen points out, making absolutist statements is risky business. One example to the contrary and your entire argument goes up in smoke. Oh hey, look, smoke!
[ link to this | view in chronology ]
Re: Re:
"So please, do explain how a video that "nobody views" gets enough views to start trending. I'll wait."
I assume he'll just say that YouTube are making the numbers up, because we all know they get paid massively for ads that nobody clicks on or something.
"You want to explain that a bit better?"
Fox fans are told that they are simultaneously the most popular and most trustworthy news source, while they also rail against "mainstream media" for putting down the underdog. Anyone with the ability to believe all of this at the same time must get very confused when faced with the outside world. You can usually tell, because they'll start ranting about CNN, even though most people who don't watch Fox also don't watch that.
Sadly, this seems to be driving them to bigger fiction writers and more extreme content than realising they're being lied to.
[ link to this | view in chronology ]
Pet peeve
Why the hell don't people link to the actual thing rather than someone talking about said thing?
Case in point. The video gets 'unblocked'
"was a mistake and the videos have been restored"
was the text, and the link goes to some article at The Verge rather than to the video.
Meh.
[ link to this | view in chronology ]
Perspective
How different would this have been if rather than saying he had made a video about wifi-enabled July 4th fireworks he had described the same video as explaining how to remotely ignite incendiary devices using cheap and readily available components?
[ link to this | view in chronology ]
Terrible Idea
First off this is really dumb! If we don't know if exists or how it's done now do we avoid it or counter it?
YouTube, Facebook et al, and Twitter have become so pervasive they are no longer publishers but public forms!
[ link to this | view in chronology ]
Re: Terrible Idea
No matter how many times you say that, it doesn’t make it true. Although it does expose the laziness of people who would rather have the government force people to cater to them than actually put some effort into what they choose to consume
[ link to this | view in chronology ]
Re:
Becoming "so pervasive" does not make it a public forum. Being run by the government makes it a public forum. As far as I'm aware, the government doesn't own or run Twitter. It is a private company, which makes it NOT a public forum and NOT subject to First Amendment restrictions.
Also, I recommend taking a class in English. Your spelling and grammar is atrocious.
[ link to this | view in chronology ]
Re: Terrible Idea
Oh dear. The ill-informed catchphrases are breeding and mutating.
[ link to this | view in chronology ]
Re: Re: Terrible Idea
So you're saying, that isn't even their final form?
[ link to this | view in chronology ]
Re: Terrible Ide
Well pervasive sloped by me, I could be a liberal and blame my spelling corrector/auto-complete, but I'll be an adult and accept responsibility. Prevailant is the right term. If you get 89% of the attention are you a platform or a public form? Address the issue not your goofy prescriptions ( I only have mature cincerns!).
[ link to this | view in chronology ]
Re: Re: Terrible Ide
I wasn't referring to your choice of words, I was referring to the fact that you can't spell them or use proper grammar.
Pervasive or "prevailant" (I'm assuming you meant "prevalent" here, since "prevailant" isn't a word) makes no difference, it's still not owned/run by the government so it's still not a public forum. It is literally irrelevant how popular or how widely used it is.
It depends on if you are owned/run by the government. If you are owned/run by the government then you are a public forum (which is the correct term, not form). If you aren't run by the government then you are a private platform/forum/form, no matter how "prevalent" and widely used you are.
I did.
My doctor says I'm in good health and has not prescribed me any medications.
I assume you meant to say "concerns"? Again, I recommend an English class so that you can learn to spell properly.
[ link to this | view in chronology ]
Re: Re: Terrible Ide
Techdirt has a number of posts under the public forum tag that answer your question; you'll have an easier time finding information on what a public forum is if you learn how to spell it correctly.
The short answer is, a privately-owned platform is not a public forum. It can become what's called a limited-purpose public forum, if public officials use it for official business, but that does not mean the entire platform is a public forum. No matter how big its userbase.
You appear to be combining the ill-informed "public forum" talking point with the ill-informed "publisher, not a platform" talking point. Techdirt has covered that at considerable length too; here are two recent posts to get you started:
Once More With Feeling: There Is No Legal Distinction Between A 'Platform' And A 'Publisher'
Explainer: How Letting Platforms Decide What Content To Facilitate Is What Makes Section 230 Work
[ link to this | view in chronology ]
Security through obscurity
If your security can be bypassed, it's not really secure. Hiding knowledge of how it can be bypassed won't make it secure.
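A toy sketch of the same point, using a made-up "product key" check (purely hypothetical, for illustration): the bypass exists whether or not anyone publishes a tutorial about it, because the weakness lives in the design, not in the knowledge.

# Hypothetical "secret" key check whose only protection is that the rule is
# unpublished. Hiding the rule doesn't secure it; the bypass exists regardless.

def accepts_key(key: str) -> bool:
    """Made-up rule: the character codes must sum to a magic number."""
    return sum(ord(c) for c in key) == 1000

def forge_key() -> str:
    """Anyone who inspects the check can construct a passing key directly."""
    return "Z" * 10 + "d"                  # 10 * 90 + 100 = 1000

print(accepts_key(forge_key()))            # True -- insecure, tutorial or no tutorial

Taking a how-to video off YouTube changes nothing about the check itself; fixing the design does.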
[ link to this | view in chronology ]
I am trying to raise an issue. That, unlike newspapers, some of the big guys: Facebook et al, YouTube, Twitter, are so prevalent and dominant to such a degree that they are in fact public forms!
Looking at the EU's massive failure reinforces what a bad idea moderation is. Their moderation requirements have driven almost all the smaller platforms out, leaving only the very big guys that the essentially anti-American regs were apparently aimed at.
It can be argued that a lot of user contributions are junk, disinformation, or spam. But I find that user reviews are useful if taken with a grain of salt. If size matters and fairness is the goal then very limited or no moderation is the answer. If we are to allow platforms as private enterprise then "fairness" and good moderation requirements become silly!
[ link to this | view in chronology ]
Re:
Incorrect. The only thing that makes a forum public is whether or not it is owned/operated by the government. Facebook, Youtube, Twitter, et al are not, therefore they are not public forums.
No, the EU's massive failures show what a bad idea government interference in free speech is. Moderation is a form of free speech.
You mean free speech restrictions and taxes.
Way to just dismiss the value of the entire human race. No wonder the world doesn't like your view.
It doesn't.
Depends on your idea of fair. Is it fair to tell people what they can and cannot say on their own property? If you think so then no, fairness is not the goal.
Well since that's NOT the goal and fairness is actually letting companies decide what kind of experience they want their users to have and moderating to achieve that, then moderation IS the answer.
And the alternative is....?
The only thing silly here is your lack of understanding that moderation is part of freedom of speech and is the ONLY way to keep platforms from becoming a cesspool of vile and disgusting thoughts and ideas cluttering up the space.
[ link to this | view in chronology ]
I understand moderation as a type of free speech. The public forum came about because of government restraining free speech in public places, shutting it down.
My contention is that some of these platforms have become so large that they are like the public spaces of old. Denying access in fact denies the right of free speech. This is just like how, in the distant past, denying access to the town square denied me the ability to express my views.
I'd like to hear some substantive comments, not just reiteration of the facts. Do we need to change public policy, and should we? The EU demonstrates some of the problems.
"Way to just dismiss the value of the entire human race. No wonder the world doesn't like your view."
So now you say there are no bots, SPAM, ... and every comment is gold. Then we obviously don't need moderation and should ban it!
You can't or don't read very well! My comment was "...If fairness is the goal". But you say it has no place! I agree, how do we define 'fairness'. I don't believe it's possible. Just like defining hate speech etc.
". ..ONLY way to keep platforms from becoming a cesspool of vile and disgusting thoughts and ideas cluttering up the space.'. So now you are for censorship!
It appears that we have are two choices:
First, accept that "...cesspool of vile and disgusting thoughts and ideas cluttering up the space." And learn to deal with it OR
Second accept accept arbitrary capricious censorship as free speech (today's situation).
Neither is really that good but I agree that moderation as free speech is the lessor of two evils. In this case we need to strengthen 230 and intermediate liability! The persecution of Back Page is a good reason why! We need to ban foreign governments and entities form interference with American free speech as the EU and Indian government have recently done. We need to rather embrace 230 or throw it out not waffle like we are doing now!
[ link to this | view in chronology ]
Re:
“My contention”
It doesn't matter how many times you contend it, you're factually wrong. The rest of your comment reinforces the fact that you don't know what you're demanding on top of that.
[ link to this | view in chronology ]
Re:
This has been explained to you many times before but apparently it has not sunk in:
A space/forum is only public, in regards to the First Amendment, when it is owned and/or operated by the government. A privately owned social media platform is not and never will be.
Yes, government owned public spaces, like a town hall, town square, etc... A privately owned convention center or amphitheater (or social media platform) is not the same thing.
You can contend all you like but that won't make it true. Size has absolutely nothing to do with whether the space is "public" or not, with regards to freedom of speech and the First Amendment.
Only if owned/operated by the government. I can, for example, host a block party in my backyard and have you removed by force for spouting your nonsense if I so choose.
Again, the town square is owned/operated by the government. Social media platforms are not.
You have, you just apparently don't like it. Not to mention the facts say you are wrong. Facts are objective, comments not based in facts are subjective. Are you implying that you don't like the facts and want more people to join you in denying reality?
No and no, for reasons obvious to everyone but you.
Yes, they are an excellent example of what happens when the government tries to restrict the freedom of speech of people and companies, including social media. Let's not do that here ok?
Let me set fire to that strawman you've constructed there. The original statement to which I wrote that reply states the following (bolding mine):
Given that, I took his comment to mean these submissions are being done by actual human beings, voicing their opinions, statements, points of view, content, etc... NOT content auto-generated by bots, which are generally not considered users. I did not address, nor imply, whether or not some content was generated by automated means, or the veracity of said content. Do not put words in my mouth.
That's funny coming from you, since I just got done explaining how you didn't properly read the context of my comment.
You know, lying about what I said just one comment up is not a good look. I said it depends on what you consider to be fair, then gave two example scenarios, clarifying that.
Well then, if you can't define it, it shouldn't be made into law, should it? If you think it similar to hate speech, then you should be in favor of not making laws dictating what is or is not "fair", and instead it should be left up to the platforms and users to decide. Hm?
Moderation is not censorship, and nowhere did I state that.
Newsflash, moderation is one method of dealing with it.
I'm sorry, I wasn't aware the government was already telling me what I can and cannot say online, since that is the definition of censorship. Private content moderation is not censorship, no matter how many times you lie about it. It's also not arbitrary and capricious. The terms of use are clearly laid out when you sign up to use social media, and you have to click "I agree to abide by these terms" before you can use the service. Since you agreed to abide by the rules, you have no right to complain when you get booted off for breaking them.
Only because you don't understand what you are talking about.
It's not an evil, it IS as much free speech as me stating my opinion. And if you agree, then why are you arguing against it?
Those things are in most cases mutually exclusive. 230 says the platforms shouldn't be blamed for actions of their users. Strengthening that means you can't hold them responsible in intermediary liability situations. You need to go back and re-educate yourself on these laws because it is painfully obvious you don't understand any of this.
It was shown that BackPage was actually catering, or at least turning a blind eye, to sex trafficking, which is illegal. That's why they got taken down. If they had stepped up better, they would still be around.
This is an absolutely stupid and moronic statement. We need to do no such thing since foreign countries CAN'T interfere. Americans aren't subject to foreign laws, so aside from sending cops and troops over here (which would be an invasion and act of war) how the hell would they even do this in the first place? You really have no concept of how the world works.
[Citation needed.]
Duh, we already have. That's why it's a law and courts have upheld it. You started this post out arguing against 230, now you are for it. Pick one.
Then tell the idiots in government to stop being morons and technologically illiterate Luddites and STOP trying to undo 230 protections.
[ link to this | view in chronology ]