Three Lessons In Content Moderation From New Zealand And Other High-Profile Tragedies
from the no-magic-wands-available dept
Following the terrorist attacks on two mosques in Christchurch, New Zealand, social media companies and internet platforms have faced renewed scrutiny and criticism for how they police the sharing of content. Much of that criticism has been directed at Facebook and YouTube, both platforms where video of the shooter's rampage found a home in the hours after the attacks. The footage was filmed with a body camera and depicts the perpetrator's attacks over 17 minutes. The video first appeared on Facebook Live, the social network's real-time video streaming service. From there, Facebook says, it was uploaded to a file-sharing site, the link was posted to 8Chan, and the video began to spread.
While the world struggles to make sense of these horrific terrorist attacks, details about how tech companies handled the shooter's video footage and written manifesto have been shared, often by the companies themselves. Collectively, these details, combined with the public discourse on and reaction to what the New York Times called "a mass murder of, and for, the internet," have made clear three fundamental facts about content moderation, especially when it comes to live and viral content:
1. Automated Content Analysis is Not a Magic Wand
If you remember nothing else about content moderation, remember this: There is no magic wand. There is no magic wand that can be waved to instantly remove all terrorist propaganda, hate speech, and graphically violent or otherwise objectionable content. There are some things that automation and machine learning are genuinely good at: operating within a narrow, well-defined environment (rather than at massive scale) and identifying repeat occurrences of the exact same (completely unaltered) content, for example. And there are some things they are really bad at: interpreting nuance, understanding slang, and minimizing discrimination and social bias, among many others. But perfect enforcement of a complex rule against a dynamic body of content is not something automated tools can achieve. For example, the simple addition of a watermark was enough to defeat automated tools aimed at removing video of the New Zealand shooter.
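To make that brittleness concrete, here is a minimal, purely illustrative Python sketch. The byte strings stand in for real video data (an assumption for the example); the point is that exact-match fingerprinting, the simplest form of automated detection, fails against even a trivial alteration like a watermark:

```python
import hashlib

def exact_fingerprint(data: bytes) -> str:
    """Cryptographic hash: any change to the input yields a completely different digest."""
    return hashlib.sha256(data).hexdigest()

# A toy stand-in for a video file, and the same file with a one-byte "watermark" appended.
original = b"frame-data-frame-data-frame-data"
watermarked = original + b"\x01"  # a trivial alteration, like adding a watermark

h1 = exact_fingerprint(original)
h2 = exact_fingerprint(watermarked)

print(h1 == h2)  # False: the tiniest edit produces an entirely different fingerprint
```

An unaltered re-upload would match its known fingerprint instantly, which is why this approach works well against "completely unaltered" content and poorly against everything else.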
Some have therefore suggested banning all live video. However, that overlooks activists' use of live streams to hold governments accountable and report on corruption as it happens, among other uses. Further, the challenges of automated content analysis are by no means limited to video. As a leaked email from Google to its content moderators reportedly warned: "The manifesto will be particularly challenging to enforce against given the length of the document and that you may see various segments of various lengths within the content you are reviewing."
All of this is to reiterate: There is no magic wand, and there never will be. There is absolutely a role for automated content analysis in keeping certain content off the web. The use of PhotoDNA and similar systems, for example, has reportedly been effective at ensuring that child pornography stays off platforms. However, the nuance, news value, and intricacies of most speech should give pause to those calling for mass implementation of automated content removal and filtering.
2. The Scale, Speed, and Iterative Nature of Online Content – Particularly in This Case – is Enormous
It is a long-standing fact of the internet that it enables communication on a vast scale. Reports from YouTube and Facebook about the New Zealand attack seem to indicate that this particular incident was unprecedented in its volume, speed, and variety. Both of these companies have dedicated content moderation staff and it would be easy to fall into the trap of thinking that this staff could handily keep up with what seems to be multiple copies of a single live video. But that overlooks a couple of realities:
- The videos are not carbon copies of each other. Any number of changes can make identifying variations of a piece of content difficult. The iterations could include different audio, animation overlays, cropping, color filters, use of overlaid text and/or watermarks, and the addition of commentary (as in news reporting). Facebook alone reported 800 "visually distinct" videos.
- There is other content – the normal, run-of-the-mill stuff – that continues to be posted and needs to be addressed by the same staff that is now also scrambling to keep up with the 17 copies of the video being uploaded every second to that single platform (Facebook, in this case; YouTube's numbers were somewhat lower, but still reached one upload per second, culminating in hundreds of thousands of copies).
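The "visually distinct" problem above can be sketched with a toy perceptual hash (entirely illustrative; production systems fingerprint real video frames with far more robust techniques). Unlike an exact fingerprint, a perceptual hash changes only a little when the content changes a little, so a distance threshold can still flag a lightly altered copy:

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is above the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Count of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

# A 4x4 grayscale "frame" and a lightly altered copy (e.g. a small overlay in one corner).
frame = [
    [200, 200,  50,  50],
    [200, 200,  50,  50],
    [ 50,  50, 200, 200],
    [ 50,  50, 200, 200],
]
altered = [row[:] for row in frame]
altered[0][0] = 120  # one pixel changed, like a small watermark

d = hamming(average_hash(frame), average_hash(altered))
print(d)  # → 1: only one bit differs, so a small distance threshold still catches the copy
```

The trade-off is the hard part: set the threshold too tight and watermarked copies slip through; too loose and unrelated content (including news reporting that quotes the footage) gets swept up.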
It's worth noting here that not a single person reported the live video stream to Facebook for review. The video was reportedly viewed "fewer than 200 times" while it was live but the first report came 12 minutes after the stream ended – a full 29 minutes after the broadcast began. That's a lot of time for a video to be shared and reposted by people motivated to ensure it spread widely, not only on Facebook, but on other sites as well.
In addition to exposing the challenges of automated content review, the New Zealand attacks demonstrated weaknesses in the companies' own systems, particularly when dealing with emergencies at scale. YouTube, for example, was so overwhelmed by the flood of videos that it opted to circumvent its standard human review process to hasten their removal. Facebook, too, struggled. The company has a process for handling particularly sensitive content, such as an individual threatening to commit suicide; however, that process wasn't designed to address a live-streamed mass shooting and likely could not easily be adapted to this emergency.
3. We Need Much Greater Transparency
As non-governmental and civil society organizations have hammered home for years, there needs to be more transparency from tech companies about their policies, processes, and practices that impact user rights. One of the more promising developments from 2018 in this space was the release of reports by YouTube, Twitter, and Facebook providing a quick peek under the hood with respect to their content enforcement. While there is still a long way to go from the first year's reports, the reaction to their publication shows a hunger for and deep interest in further information from tech companies about their handling of content.
Among companies' next steps should be transparency around specific major incidents, including the New Zealand attacks. Social media platforms are still reeling from over a week of whack-a-mole with a heavy side of criticism. But once they are able to identify trends or data points across the incident, they should be shared publicly and contextualized appropriately. For example, how did Facebook identify and handle the 800 distinct versions of the video? Did those include uses of the video in news reporting? How was the Global Internet Forum to Counter Terrorism – an entity formed to share information on images and videos between companies – engaged?
One challenge for companies when providing transparency into their policies and practices is doing so without providing a roadmap to those looking to circumvent the platforms' systems. However, extant transparency reporting practices – around, for example, government requests for user data – suggest companies have found a balance between transparency and security, tweaking their reports over time and contextualizing the data within their larger efforts.
What's Next?
There are no quick fixes. There are no magic wands. We will continue to debate and discuss and argue about whether the tech companies "did the right thing" as they responded to the New Zealand shooter's video, his manifesto, and public reaction to both. But as we do so, we need transparency and insight into how those companies have responded, and we need a shared understanding of the tools and realities of the problem.
As details have emerged about the attacks in New Zealand and how they played out on social media, much of the analysis around internet companies' handling of user content has fallen into one of two buckets:
- Tech companies aren't doing enough and refuse to use their enormous money/power/tools/resources to address the problem; or
- The problem is unsolvable because the volume of content is too great for platforms to handle effectively.
The problem with presenting the issue as this dichotomy is that it overlooks – really, completely ignores – the fact that an unknown number of viewers watched the live video but did not report it to Facebook. Perhaps some viewers were confused and maybe others believed it was a joke. But the reality is that some people will always choose to use technology for harm. Given that fact, the question that will ultimately guide this debate and shape how we move forward is: What do we want our social media networks to be? Until we can answer that question, it will be hard, if not impossible, to address all of these challenges.
Reposted from the Center for Democracy & Technology
Filed Under: ai, automation, content moderation, new zealand, scale
Reader Comments
Step One: stop blaming the messenger
[ link to this | view in thread ]
Filtering stuff like this means keeping people in the dark. If we feel it is bad speech and bad actions, then perhaps the governments and people who have a problem with information such as this being made available on the internet should form counter-speech squads.
If you are not exactly likely to fall under the influence of your enemy's propaganda, it is far more likely to reinforce your opinion against them or galvanize you to action. And when that happens, violent fringe groups or potentially sympathetic people get to see just how many people are not down with their shit. Then they can really whine about how they are discriminated against.
[ link to this | view in thread ]
Re:
It appears you are arguing the video should remain online. Do you agree that there is a limit? If so, what is it?
I fail to see the validity of an argument for not removing some content that is just wrong. Now, define what that is ... I know it when I see it. That doesn't quite cut it, huh. Can't please all the people all the time - blah blah.
What about kicking puppies? Should that type of video remain online? Why?
[ link to this | view in thread ]
Re:
They are no longer "just the messenger". They are a for profit courier service that knowingly promotes their services to the extremes of society.
[ link to this | view in thread ]
Re: Re:
By that logic so do car manufacturers, gun manufacturers, knife manufacturers, fertilizer manufacturers, chemical supply companies (wherein you can buy enough nuclear material to make a bomb without ever flagging the authorities), etc..., etc...
They knowingly promote their services to the masses, aware that some people will use them for purposes they did not intend. This is the same for ANY company. You cannot blame one company for this without also blaming them all.
Whether you choose to blame them all or just this one, what do you suggest they do about it? Stop all marketing of their services altogether and hope somebody finds their services and uses them? Public marketing is just that, public. You can't engage in marketing and prevent people who intend to misuse it from seeing it.
[ link to this | view in thread ]
Re: Re:
Hi Poochie!
[ link to this | view in thread ]
Too Big, Should Fail!
This... Last bucket, "The problem is unsolvable because the volume of content is too great for platforms to handle effectively."
Use the FCC model prior to Michael Powell: local content rules, break ownership up if they cross regions or have more properties in said region. For Facebook it would mean not owning Instagram, not having news feeds that go across state lines, content derived locally. National content comes from approved trusted sources.
Reality - not gonna happen, but it sure would be a boon to local news services, mom and pop shops, and the main street you see evaporating.
[ link to this | view in thread ]
I'd say this and the last article are just two facets of the same jewel (or, if you prefer, two flakes off the same cow pie.) "Nerd harder" to fix any human problem is always and only a recipe for failure--like the old tribal custom of killing the medicine man if the patient died, it may briefly satiate the relatives, but it does nothing to reduce the mortality rate.
[ link to this | view in thread ]
Re: Re:
Taking it down does nothing to help the puppy, keeping it up lets us know who the puppy-kickers are and provides evidence against them.
[ link to this | view in thread ]
Re: Re: Re:
All those industries have laws that apply to them. Safety requirements for cars, reporting of large sales of fertilizer, etc. Social media has none.
[ link to this | view in thread ]
A lot of snowflakes here
[ link to this | view in thread ]
Re:
[ link to this | view in thread ]
Says the snowflake.
[ link to this | view in thread ]
Nobody cares you fucking nerds.
[ link to this | view in thread ]
Re: Re:
They are still just a private company carrying other people's messages.
They are a for profit courier service that knowingly promotes their services to the extremes of society.
Please cite how Facebook is marketing to Nazis?
[ link to this | view in thread ]
Re: Re: Re: Re:
None of which has anything to do with how they promote their products, which was your original point.
Please show me how you can beat someone over the head with a social media post and directly cause physical injury or death and then we can talk.
And there are also laws that apply to social media companies and other internet companies. If you think otherwise I suggest you go read a law book. They also all have rules banning this type of content from their platforms, so your argument fails by default.
[ link to this | view in thread ]
Old man is irrelevant
There is no such thing as "too big". Define "too big".
And how do you propose to break up Facebook's core service? Or Reddit? Or Youtube?
Fair enough. Though I disagree.
Pffftt. HAHAHAHAHAHAHA!!! So tell me genius, I live in State A, the rest of my family lives in State B. Do pray tell how I would get their status updates in my news feed in such a scenario. Come back when you actually know how technology works.
What if I don't want to see my local content? What if my local content sucks? It is an infringement of my rights for the government to define what legal content I am and am not allowed to see in my news feeds.
You do realize that Facebook doesn't actually publish any news, right? They just aggregate it from other sources. So, mission accomplished?
Yes but likely not for the reasons you are thinking of.
If they can't adapt then they should die. I have no sympathy for a legacy company who can't adjust to new technology and new ways of doing things. You want a law that keeps things the same as they were 50 years ago. Well guess what, technology and ways of doing things 50 years ago sucked, was inefficient, unsafe, and EXTREMELY SLOW.
So you can take your old man yelling at kids to get off his lawn schtick and stay off my internet.
[ link to this | view in thread ]
Re: A lot of snowflakes here
Says the Nazi.
[ link to this | view in thread ]
Re:
I'm sorry you think several million people is "nobody".
[ link to this | view in thread ]
Show me an idiot-proof realworld analog
For all these people complaining that 'bad actors' are 'using the internet wrong', look at the real world and tell me any aspect of the real world that is bad actor proof.
Some Examples:
Vehicles -- Europe has had several high-profile murders by vehicle. You don't see people blaming Mercedes and GM.
Household Chemicals -- Tide Pod challenges, etc. Need I say more?
Prescription Drugs -- Some people need opioids, some people abuse them.
Firearms -- Probably the closest analog to the internet in terms of people being up in arms about 'never let any bad thing ever happen with your product'.
I'd rather people understand the problem and just accept that anything we build, people will find a way to pervert it. Every weapon we have created over the course of human history is a testament to that.
[ link to this | view in thread ]
Re:
Your performative apathy is pointless, you know.
[ link to this | view in thread ]
Re: Re: Re:
https://www.theatlantic.com/technology/archive/2017/09/on-facebook-advertisers-can-show-their-ads-only-to-jew-haters/539964/
https://theintercept.com/2018/11/02/facebook-ads-white-supremacy-pittsburgh-shooting/
https://www.latimes.com/business/technology/la-fi-tn-facebook-nazi-metal-ads-20190221-story.html
[ link to this | view in thread ]
Re: Re: Re: Re:
To quote Gary: "Please cite how Facebook is marketing to Nazi's".
[ link to this | view in thread ]
Re: Re: Re: Re: Re:
If that is the case then your original counter argument is also false. Can't have it both ways.
You are assuming that the only valid injury is physical.
You are also assuming that inciting someone else to violence is OK if it's done over the internet. It is never OK.
"go read a law book" . If you practiced what you preach you would not be calling it "a law book".
Banning certain users without policing your own policy is no excuse. There is no loss of argument by default.
[ link to this | view in thread ]
Re: Re: Re: Re: Re:
"Nazis" is Garry's words, not mine. He can find his own quotes. And try reading the news articles rather than posting a glib answer.
[ link to this | view in thread ]
Re: Show me an idiot-proof realworld analog
"I'd rather people understand the problem and just accept that anything we build, people will find a way to pervert it. Every weapon we have created over the course of human history is a testament to that."
I assume by that you don't run antivirus software either.
[ link to this | view in thread ]
No, we are assuming that liability for injuries caused by other people should be placed on those people instead of on the tools they may have used to cause those injuries.
[asserts facts not in evidence]
(Wait shit that’s someone else’s schtick…)
[ link to this | view in thread ]
Please cite how Facebook is intentionally and knowingly advertising directly to Nazis/White supremacists.
[ link to this | view in thread ]
Re: Re:
The extremes of society need interaction with normal views in society to de-radicalize them. Extreme people are grown under isolation, detached from normal views. The crusade to isolate everyone into ever more isolated pockets is sealing away time-capsules set to spring forward with ever more extreme views at some time in the future.
[ link to this | view in thread ]
Have to call out New Zealand for their ham-fisted response that completely embraced the shooter's stated objectives, and carried out those wishes to a T. The shooter's manifesto clearly lays out his intent to use the event to force an over-sized response by the state to curtail the rights of good people. More might be aware of this, but it's apparently illegal for those in New Zealand to read the document for themselves and recognize that the government picked up the ball and ran forward on all the things the shooter hoped to accomplish.
[ link to this | view in thread ]
Re:
Read the third linked news article from the LA Times above. Or you could just ignore it again and pretend it's all OK.
[ link to this | view in thread ]
Re:
" No, we are assuming that liability for injuries caused by other people should be placed on those people instead of on the tools they may have used to cause those injuries."
Here we have the crux of the matter, you see social media as just a tool, without morality, without responsibility, without accountability. It's about time we treated them for what they are, companies that make part of their profit from the misery of others. They need to be regulated.
[ link to this | view in thread ]
Re: Re: Show me an idiot-proof realworld analog
Copyright is being enforced by trolls but that's never stopped you, Jhon.
[ link to this | view in thread ]
No, I do not see social media as a tool without morality, responsibility, or accountability. I see social media as a tool that can be used for good or evil — just like any other tool — and the creators and maintainers of that tool as those who think they are above morality, responsibility, and accountability. Yes, social media does need to be held responsible for its failings; people being bastards regardless of the existence of social media is not one of those failings.
So…we need to treat them like gun manufacturers?
[ link to this | view in thread ]
Re:
Treating them in a way NZ is now treating gun manufacturers, yes.
[ link to this | view in thread ]
Re: Re:
So does that mean that you find the funeral industry completely unacceptable because they profit from human misery? Under that logic hiring someone else to dig a grave should be illegal because it means someone will profit from human misery!
[ link to this | view in thread ]
Re: Too Big, Should Fail!
Sounds like what you want is 'felony interference with a business model' to not be a sarcastic remark but a real legal statute.
[ link to this | view in thread ]
Re: Re: Too Big, Should Fail!
I thought it already was ... an unwritten law enforced as if it were real.
[ link to this | view in thread ]
Re: Re: Show me an idiot-proof realworld analog
Anti-virus software is indeed perverted, was from the start.
Providing a false sense of security is worse than no security at all.
[ link to this | view in thread ]
The authorities in New Zealand also ran up against the impossible task of finding anyone inside Facebook willing to talk or able to take down the stream. These companies don't want to spend money dealing with complaints, or the public. We shouldn't be too worried if they have to spend more money to deal with complaints.
[ link to this | view in thread ]
Re: Re:
I did. Nowhere in that article does it say Facebook is doing what you claim.
So again, to quote Stephen and Gary, please cite how Facebook is intentionally and knowingly advertising directly to Nazis/White supremacists.
[ link to this | view in thread ]
Re: Re:
So... You think that Facebook needs more regulations - but guns need less?
The cognitive dissonance is strong today.
[ link to this | view in thread ]
Re: Re: Re:
Well, unless they are purposely burying people alive, the funeral industry is not the one causing the human misery using their "tools" (coffins, cremation, etc.). Unlike gun manufacturers' guns, some of which are very definitely used to cause human misery.
[ link to this | view in thread ]
Re: Re: Re: Show me an idiot-proof realworld analog
You are replying to the wrong person.
[ link to this | view in thread ]
A few weeks ago I was listening to a podcast: Tomorrow with Joshua Topolsky, Episode 152. The hosts were discussing the conditions in which Facebook’s moderators work. They get paid garbage wages for the work they do, which psychologically scars them and gets them to start believing in conspiracy theories and other nasty things after looking at so much of it for so long. And this is only to stop a small portion of that nasty content from existing on the platform as there’s no way for them to get it all.
One of the hosts of the podcast, Ryan Houlihan, posited the idea that we should actually be questioning the concept of a site where users upload as much as they want of anything that they want and they’ll then use underpaid workers and ineffective algorithms to sort out just some of it as a healthy, valid business model that a corporation should have. Regarding YouTube, he had this to say: “YouTube has 400 hours of content uploaded a minute. Maybe that’s an impossible thing to moderate and it’s not responsible for a company to allow for 400 hours of content to be uploaded a minute?” He makes a good point.
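The "400 hours per minute" figure quoted above can be turned into rough arithmetic. This is an illustrative back-of-the-envelope sketch, assuming a single real-time viewing pass and 8-hour reviewer shifts (both assumptions, not reported facts):

```python
# Back-of-the-envelope arithmetic for the "400 hours per minute" figure.
upload_hours_per_minute = 400
minutes_per_day = 24 * 60

hours_uploaded_per_day = upload_hours_per_minute * minutes_per_day
print(hours_uploaded_per_day)  # → 576000 hours of new video per day

# If each reviewer watches footage in real time for an 8-hour shift:
reviewer_hours_per_day = 8
reviewers_needed = hours_uploaded_per_day // reviewer_hours_per_day
print(reviewers_needed)  # → 72000 full-time reviewers just to watch everything once
```

Whatever the exact staffing model, the order of magnitude is the point: purely human review of that volume is not a realistic option.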
You posit the question “What do we want our social media networks to be?”. My answer is “Something much smaller and actually manageable rather than the ludicrous, impossible-to-moderate-at-scale-free-for-all social networks and video sites that we have now.”
[ link to this | view in thread ]
Re:
[ link to this | view in thread ]
'Are you with a label/studio/publisher? No? Then beat it.'
You posit the question “What do we want our social media networks to be?”. My answer is “Something much smaller and actually manageable rather than the ludicrous, impossible-to-moderate-at-scale-free-for-all social networks and video sites that we have now.”
And when/if someone takes you up on that, and you want to post something only to find out that nah, you aren't in the 'in group' and therefore you get to watch but not participate, maybe then you'll see the problem with that idea.
[ link to this | view in thread ]
We get it, you jerk off to photos of concentration camps.
[ link to this | view in thread ]
Re:
Why should Facebook manage people any better than any government has ever achieved?
The way to fertilize the ground for violence is to isolate small groups of people, because it is people who feel isolated from society, and do not see a future for themselves, that are most likely to fall for the ideas of an extremist leader, and become their human bombs, suicide attackers, and other forms of cannon fodder.
[ link to this | view in thread ]
Re: Re: Re: Re: Re: Re:
Oh this is going to be fun.
Really now. So you contend that I can take a social media post and physically hit someone over the head with it? You also apparently contend that social media is being marketed as "completely safe and will never cause harm or offense to anyone". Interesting takes, interesting takes.
Well I'd say my original counter argument is still intact then since none of those things are actually part of reality. Social media markets their platforms as ways to communicate and stay in touch with people all over the world. What part of that is false advertising?
Because you can't legislate speech, which can only directly cause mental and emotional harm. I can't take words I speak, write, or type and use them to directly physically harm someone. The First Amendment protects pretty much everything I say with some EXTREMELY narrow exceptions. There is no law against saying mean, hurtful things.
No, I'm not, nor did I ever state that. But that is completely irrelevant in this topic of conversation. Social media platforms do not actively incite someone to violence over the internet or otherwise. People posting TO social media do that. This is no different than a moron standing on a street corner with a megaphone inciting people to violence. You don't remove the street corner to stop the moron.
Do tell. What should I call it then? A textbook of law? A law textbook? A book containing the entirety of US law? A book of law? What part of that doesn't make sense to you? You're grasping at straws if that's all you got. The fact of the matter is the law says you're wrong, and if you read any book about US law, it would tell you the same thing. The specific words I use to describe said books are irrelevant.
I'm sorry, I think you are confused. The users get banned for violating the policies. That is, by definition, policing your own policy. Now, that is not to say that some social media companies haven't at times seemed to enforce their policies in a confusing manner, but they HAVE enforced them. Saying otherwise is blatantly ignorant.
You can deny it all you want but your argument floats about as well as a lead colander. When your argument is predicated on false statements and logical fallacies, you fail by default. And all of your statements are easily checked with independently verifiable data, facts, and sources that all say: you're wrong.
Please be more clear, there are two ways to interpret what you've written here. I'm going to assume that you mean users acting contrary to a social media platforms rules does not shield said platform from scrutiny or accountability, because the only other way to interpret that sentence is that the platform is acting contrary to the platforms rules and that just doesn't make any sense at all.
So going with the assumption that we're talking about users on a platform, actually it does as far as the actions of those users are concerned. If the platform itself does something to violate the law, then no, it's not shielded. But if the users do something, then yes, the law states you can't hold the platform liable. For the same reason you can't hold the owner of a house liable if two of his guests get into an argument and one assaults or kills the other. Again, I suggest reading the law, it's QUITE clear on this point.
[ link to this | view in thread ]
Re: Re: Re: Re:
So, does that mean you find the tool industry completely unacceptable because they regularly profit from human misery, since their tools are routinely used to bash in someone's head with a hammer, cut off a person's head with a hacksaw, steal cars with a crowbar, break into homes with a screwdriver, set fire to a house with a portable torch, and other acts with tools that cause human misery on a daily basis?
[ link to this | view in thread ]
Re:
Are you talking about the live stream? Facebook didn't even know about it until after the live stream ended BECAUSE NO ONE REPORTED IT TO THEM.
If you're talking about the copies that spread after it ended, this is blatantly false, as evidenced by the fact that Facebook engaged for days in an endless game of whac-a-mole, removing it wherever they found it on their site. Same goes for YouTube and some of the other sites it was uploaded to.
You might want to check your facts before you go off like that.
[ link to this | view in thread ]
Re:
Then maybe we should find better ways of moderation and accept that bad people do bad things and you're never NOT going to be exposed to it.
So you're for government censorship then and want to ditch the first amendment? Because that's the only way to make that happen.
Well, moderation at that scale is nearly impossible to get right. Should they have better working conditions? Absolutely. But maybe we should all stop getting our panties in a twist because we saw something we didn't like and subsequently tried to force a third party to do something about it when we should be trying to go after the root cause.
No, he makes a stupid point. All that content is being uploaded by individuals. If you say you can't upload that much content per minute, you have now just infringed on individuals' right to free speech, creation, and expression. The only way to not allow that much content to be uploaded at that rate is for the government to dictate what people can and cannot do online. I'm sorry, but that's completely stupid.
Welcome to the human race, a ludicrous, impossible-to-moderate-at-scale-free-for-all social network and visual space, where bad people will always find a way to do bad things to other people using any and all available tools at their disposal. What you want is censorship and loss of freedom so that you don't have to see anything bad or offensive. That's not the real world and can never work.
Besides that, if you do get something much smaller, you will have set communication, arts, technology, expression, and jobs back about 20-30 years and put a cap on progress.
[ link to this | view in thread ]
Re: Re:
Be gone you foul and disgusting human being.
[ link to this | view in thread ]
Re: Re: Re:
How surprising, he ran away again...
[ link to this | view in thread ]
Re: A lot of snowflakes here
You sound like blue's crazier cousin.
[ link to this | view in thread ]
Re: Re:
Methinks you have an overfull nappy.
[ link to this | view in thread ]
"No, he makes a stupid point. All that content is being uploaded by individuals. If you say you can't upload that much content per minute, you have now just infringed on individuals right to free speech, creation, and expression. The only way to not allow that much content to be uploaded at that rate, is for the government to dictate what people can and cannot do online. I'm sorry but that's completely stupid."
Because multiple sites, each of which has a manageable amount of content, couldn't possibly exist, as that would threaten Google's practical monopoly in the viral-video space?
[ link to this | view in thread ]
Re: Re: Re: Re: Show me an idiot-proof realworld analog
You were replying to the wrong person, and didn't acknowledge that your own obsession causes you to see "Jhon" all over the place.
[ link to this | view in thread ]
Re: Re:
"you see social media as just a tool, without morality, without responsibility, without accountability."
Wait a sec .... my tools are supposed to have morals?
How are my tools supposed to be responsible? ... wtf?
Accountability?
It sounds as though you want the companies that make tools to exhibit all these traits ... right? Because certainly you realize that an inanimate object is incapable of having human traits.
"It's about time we treated them for what they are, companies that make part of their profit from"
[ link to this | view in thread ]
Re: Re:
So you approve of this snuff film being shown, then, that we should allow the mentally weak to be desensitised to suffering to death. What, do you think the world needs more copycats? Facebook has been working the demographics of its subscribers for years; if they were to allow this sort of message in their system, like-minded people would be barraged with this content, and then the only thing the rest of us could do is bet on where the next slaughter would be.
[ link to this | view in thread ]
Re:
Moderating all the output of the human race is not possible. Whether you have a few big sites or lots of little sites, you need the same number of moderators, because the problem is the sheer volume of output the human race can generate, and employing enough people as moderators to deal with that flow.
[ link to this | view in thread ]
Re:
By that logic, SPAM shouldn't be regulated.
[ link to this | view in thread ]
Re: Re:
You can regulate it by making people pay for the cost of their "output." That's the logic behind anti-SPAM rules and unsolicited faxes and texts.
[ link to this | view in thread ]
Re: Re:
It's not FREE speech if it costs money to moderate.
Let the users who overwhelm the content system pay for its moderation. They are not censored. You can even offer X amount of content tax-free, or waive the tax for the indigent if you want to be that principled.
[ link to this | view in thread ]
Re: Re:
The opposite side of that coin is people who feel empowered because they found like minds online and confuse that with mainstreaming.
[ link to this | view in thread ]
Re: Show me an idiot-proof realworld analog
Opioid makers are settling lawsuits with states left and right.
Carmakers are not required to control their automobiles on the highway. Internet providers have the means to do this with relative ease. If too much content is being uploaded, then tax it. It costs $55 to register a copyright in the US, but why is that not free? Because the LOC would be overwhelmed, as would PACER if they eliminated that fee.
[ link to this | view in thread ]
Re: Too Big, Should Fail!
Tax those who flood the system with content.
There. No censorship.
[ link to this | view in thread ]
And you will see far less output because creating a “free expression fee” will stop a hell of a lot of people from speaking their mind, artists included. No thanks.
[ link to this | view in thread ]
Two things.
Define “manageable amount of content” in objective, one-size-fits-all-websites terms.
[ link to this | view in thread ]
Unless people cannot afford to pay the tax, in which case their voice is silenced. Censorship.
[ link to this | view in thread ]
Re: Re: Re:
That way you start a bidding war for people to get their ideas in front of the public, with no guarantee those ideas will not cause violence and disruption of society. Calvin, Luther, and Marx, to name a few, were people in a position to have their words published in eras when few people had that privilege.
[ link to this | view in thread ]
Re: Re: Re:
Look at how well that works in politics, where those with money decide what the law should be.
[ link to this | view in thread ]
The question of who would instigate and regulate this tax also comes into play. Last time I checked, the government is not in the business of deciding whose speech gets to be seen.
[ link to this | view in thread ]
Re: Re: Re:
Please show consideration to your fellow posters and clean up the excess straw after you're done stuffing your strawman with it.
[ link to this | view in thread ]
You first
I'm seeing at least eight posts from you in this comment section alone, that'll be $4, to be donated to TD for hosting your content under the 'You want to post, you got to pay' system you're proposing.
If your input is really worth posting, I'm sure you'll have no problem paying the content uploading tax. Unless, of course, the system you want to foist on others shouldn't apply to you; but as that would be grossly hypocritical, I'm sure that won't be the case and you'll be making your donation promptly.
[ link to this | view in thread ]
Re: Re:
You do know what courier means don’t you?
[ link to this | view in thread ]
Re: Re:
NZ isn’t treating gun manufacturers any particular way. They are treating their products like the dangerous weapons they are.
[ link to this | view in thread ]
Re: Re: Re:
Besides, since there are no native gun manufacturers in NZ, this is mostly an import/export issue. And I’m sure you’re not stupid enough to argue that NZ isn’t allowed to control what comes over their own borders... are you?
[ link to this | view in thread ]
Re: Re: Show me an idiot-proof realworld analog
"Carmakers are not required to control their automobiles on the highway. Internet providers have the means to do this with relative ease."
Interesting ... internet providers have the capability to control vehicles on the highway. I was unaware of this development, when did this occur and how was it approved by the state ... and which state was it?
[ link to this | view in thread ]
Re: Re: Re:
There is more than one kind of free.
Free speech refers to free as in freedom, not free as in beer.
[ link to this | view in thread ]
Re: Re:
As if it is now?
[ link to this | view in thread ]
Re: A lot of snowflakes here
Obvious troll is obvious.
[ link to this | view in thread ]
Re: Re: Re:
"It's not FREE speech if it costs money to moderate."
Come back once you've learned how words can have different meanings referring to completely different concepts.
[ link to this | view in thread ]
Re: Re:
"They are a for profit courier service that knowingly promotes their services to the extremes of society."
The same way crowbar manufacturers are leading suppliers to burglars, you mean?
Bobmail, do the words "dual-use technology" even register in that shrunken prune you tote around between your ears?
[ link to this | view in thread ]
Re: Re:
"Here we have the crux of the matter, you see social media as just a tool, without morality, without responsibility, without accountability."
Because that's what they are, and have been ever since the first upright-walking ape chose to bear a message for another.
"It's about time we treated them for what they are, companies that make part of their profit from the misery of others."
Nope. It's time that YOU go back to school and learn a bit about the role of communications and messenger services in a democratic society.
Or perhaps move to North Korea, because not even China and Russia will sign on to the bullshit you keep spouting.
[ link to this | view in thread ]
Re: Re: Re:
"Please cite how Facebook is marketing to nazi's?"
I think he must be assuming that if they're marketing for him they'll market to absolutely any form of scum, no matter how vile.
I can't even fault him for that argument.
[ link to this | view in thread ]
Re: Re: Re:
"You do know what courier means don’t you?"
Judging by his consistent arguments around the issue, I'd say that yes. Yes he does. It's the messenger, whom he wants shot.
[ link to this | view in thread ]
Re: Re: Re:
...and that's the main issue with modern society. When we see evidence about heinous wrong-doing our first response isn't "Holy cr*p, someone should do something about that vile scumbag!".
It's become "Holy cr*p, someone needs to take this down stat, so I don't have to know this shit is happening!".
Baghdad Bob/Bobmail/Blue, needless to say from his prior argumentation, is a staunch supporter of the second view.
[ link to this | view in thread ]
Re: Too Big, Should Fail!
"Use the FCC model prior to Micheal Powell, local content rules, break ownership up if they cross regions or have more properties in said region. For Facebook it would mean not owning instagram, not having news feeds that go across state lines, content derived locally. National content comes from approved trusted sources."
So basically roll back modern communications technology to 1960?
[ link to this | view in thread ]
Re: Re: Show me an idiot-proof realworld analog
"I assume by that you don't run antivirus software either."
The same way he doesn't drink because dihydrogen monoxide is toxic and doesn't breathe because oxygen is a dangerous oxidizer?
I believe he was commenting on the phenomenon of "dual-use".
That little phrase we who live in the real world use to describe concepts such as people being able to use a hammer on the head of a nail or the head of a fellow human being with roughly equal facility.
I don't know whether "aerinai" uses or does not use antivirus software because his arguments indicate neither.
YOUR arguments, however, indicate that you refuse to drink water, which people all too often use to murder other people with.
[ link to this | view in thread ]
Re: Re: Show me an idiot-proof realworld analog
"Internet providers have the means to do this with relative ease."
Under the same necessary assumptions that would enable car manufacturers to control cars, yes.
So let me get this straight, Baghdad Bob - are we back to the argument you kept harping on back on Torrentfreak years ago where you thought Internet providers had a duty to monitor all of their customers and cut them off if something...offensive...showed up in what should have been confidential communication?
[ link to this | view in thread ]
Re:
"You posit the question “What do we want our social media networks to be?”. My answer is “Something much smaller and actually manageable rather than the ludicrous, impossible-to-moderate-at-scale-free-for-all social networks and video sites that we have now.”"
So essentially you mean "freedom of speech" is a bad idea and should get shitcanned.
Your answer, basically, is what I'd expect to hear from some 18th-century time traveler outraged that the comfy small publications and Old Boys Gossip Network of his heyday have gone the way of the dodo.
[ link to this | view in thread ]
Re: Re: Re:
"Methinks you have a overfull nappy."
Methinks his nappy is pristine - he emptied his bowels on this thread instead, after all...
[ link to this | view in thread ]
Re: Re: Re:
"The opposite side of that coin are people who feel empowered because they found like minds online and confuse that with mainstreaming."
Ah, you mean networks like "Der Stürmer" and "Breitbart".
They already exist so there's no need at all to change anything on that account.
Unless your actual argument is that we need to change things because these minorities feel outraged that the "mainstream" as a whole insists on bringing to their notice that they are a small minority group not accepted by the general citizenry?
[ link to this | view in thread ]
Re: Re: Re:
"It's not FREE speech if it costs money to moderate."
You are - literally - an idiot, aren't you?
If the user had to pay the people censoring him now THAT would be un-free speech indeed.
Not to mention that once your suggestion is implemented, you have given whatever minority is eager and keen to censor other people's opinions free rein.
Good going there, Baghdad bob. You've actually provided an argument that we here on TD shouldn't just be able to flag your offensive commentary to keep it hidden - we should be able to remove you completely, and get paid doing so.
This somehow doesn't reconcile well with your frequent rants about "censorship"...but after all these years of reading your confused rants I'm not even surprised when you offer to self-destruct every prior argument you've offered in a fit of "Because ME!".
[ link to this | view in thread ]
Re:
"Because multiple sites, each of which has a manageable amount of content, couldn't possibly exist, as that would threaten Google's practical monopoly in the viral-video space?"
Uh, those sites already DO exist, obviously. There are thousands of social networks run by individuals.
But as usual...don't let factual reality get in the way of your false assumption and dishonest rhetoric, Baghdad Bob.
[ link to this | view in thread ]
Re: You first
"I'm seeing at least eight posts from you in this comment section alone, that'll be $4, to be donated to TD for hosting your content under the 'You want to post, you got to pay' system you're proposing. "
Hey, I flagged a lot of his posts as "stupid". That means, according to the "get paid to moderate" idea he had, he owes me a few bucks as well.
Since we can't bill an AC, we should insist that he doesn't get to post at all anymore until he registers an account and settles his bill.
[ link to this | view in thread ]
Re: Re: Show me an idiot-proof realworld analog
Objection your Honor! Relevance?
Because, quite rightly, they are not responsible for what users do with their vehicles on said highway after they obtain them from the manufacturer.
No, they really don't. Or have you not seen the massive number of false detections of offensive/infringing content all the detection systems generate?
Regardless of that, why should they be held responsible for things their users upload? Internet providers didn't upload the content. There is literally no difference here, except maybe that internet providers are more akin to the highway than to a car. Your argument is fundamentally flawed.
Define too much. Who gets to define what too much is? There is no such thing. To define it and subsequently tax it is censorship and a direct and flagrant violation of the First Amendment because it limits people's speech online.
Not to mention you ignore the many other use cases, such as families backing up their home videos to cloud storage systems like OneDrive and DropBox. People already pay for their internet access and those storage services. You really want to charge them A THIRD TIME for the amount they upload per minute? Get out.
Objection your Honor! Relevance! These are completely unrelated!
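The point about detection systems is easy to make concrete. A minimal sketch (the payloads and function name are illustrative, not any platform's actual pipeline) of why exact-match fingerprinting, the cheapest automated detection approach, fails "with relative ease": a single-byte alteration, like the watermark that defeated takedowns of the Christchurch video, produces an entirely different fingerprint.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Exact-match fingerprint: any byte-level change produces a new hash.
    return hashlib.sha256(data).hexdigest()

original = b"\x00\x01 fake video payload " * 1000
watermarked = original + b"\xff"  # trivial alteration, e.g. an overlaid watermark

# Identical re-uploads are caught...
print(fingerprint(original) == fingerprint(original))     # True
# ...but a one-byte change slips past an exact-match filter.
print(fingerprint(original) == fingerprint(watermarked))  # False
```

Fuzzier perceptual-hash approaches tolerate small changes, but they buy that tolerance with exactly the false positives mentioned above; there is no setting that catches every altered copy without also flagging innocent content.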
[ link to this | view in thread ]
Re: Re: Re:
That's a really great strawman you've constructed there, and lots of words you've shoved in my mouth. But I'm feeling a comeback coming on so let's get to it.
Actually, yes, in certain scenarios, I do approve of it being shown. It is now a historical record of a terrible act that was committed. It is useful to historians and people who study human behavior. Not to mention there are likely clips from the video where the gunman is speaking that give insight into his motivation, which could help us prevent things like this from happening again by confronting those issues.
Nowhere did I say it should be left up in its entirety for the public to see on public platforms. What I did say was that you are incorrect in your statement that no one at Facebook was willing to do anything about taking it down. They did, for days! And the only reason they didn't do it sooner was because no one reported it to them. Get your facts straight.
Perhaps you should start pointing fingers at Hollywood then? There's lots of crime TV shows and movies that show EXACTLY how to commit terrible acts of suffering to death.
Allow me to introduce you to Facebook's terms of service that explicitly state this type of content is NOT ALLOWED. Let me also point you to the example in this very case where Facebook spent DAYS trying to scrub their platform of this video.
Please check your strawman and rejection of reality at the door.
[ link to this | view in thread ]
Re: Re: Re:
That's either really stupid of you or a very pathetic strawman.
Free speech does not refer to the cost of speaking, it refers to the freedom and ability to say whatever you want without being told you can't say that. I suggest you read up on the subject.
This is a terrible idea and wouldn't solve anything. Throwing more money at it isn't going to make the problem go away because this is a human interaction problem, not an economic one.
Nor should they be. Freedom of speech and all that.
Again: FREEDOM. OF. SPEECH. It's kind of a law in the US that what you suggest is pretty much the number one thing the government is NOT allowed to do. The fact that you don't understand that is telling.
[ link to this | view in thread ]
Re:
Actually no, they can't.
But not because of this. See the many other video hosting sites currently available to set fire to your strawman.
They can't exist because there is no way to feasibly limit how much content gets uploaded to a platform without pretty much destroying that platform's ability to provide any kind of hosting services. Not to mention the fragmentation of having to manage multiple accounts across multiple platforms just so you can upload all the content you need/want.
Please educate yourself before you speak and make a fool out of yourself.
[ link to this | view in thread ]
Re: Re: Re:
This is called censorship and is prohibited by the First Amendment.
You obviously don't understand anti-spam rules then. They are free and don't force anyone to pay anything for sending spam.
Pfft. What time period do you come from, the 60s? I mean, I guess on rare occasion a company that still has a fax machine might get an unsolicited fax, but realistically, that technology is all but dead. The average joe doesn't even have a fax machine.
Again, they don't really cost anything to send other than what it would cost to get your own cellphone. And there are some online services that let you do it for free.
So, where is the cost of all this again? And how is it not censorship by forcing people to pay for it?
[ link to this | view in thread ]
Re: Re: Re: Show me an idiot-proof realworld analog
Did you just describe a sociopath's wet dream?
[ link to this | view in thread ]
Re: Re: Re: Show me an idiot-proof realworld analog
"Opioid makers are settling lawsuits with states left and right.
Objection your Honor! Relevance?"
Poster may have been trying for some sort of equivalence where there is none. Opioid manufacturers have been implicated in kickback schemes; the story broke some time ago. Car manufacturers were not bribing dealers to over-prescribe vehicles to drivers, were they?
[ link to this | view in thread ]
Re: Re: Re: Re:
"That way you start a bidding war for people to get their ideas in front of the public"
A soapbox is cheaper.
[ link to this | view in thread ]
Re: Re: Re:
Thank you for clearly labeling your strawman.
[ link to this | view in thread ]
Re:
But governments sure want the social media sites to do that for them, including mandated removal of certain types of speech.
[ link to this | view in thread ]
Re: Re:
Which is why it's so imperative that we call out governments when they try to tell all these platforms what they can and can't allow on their platforms, and hold them to the First Amendment.
[ link to this | view in thread ]
'I didn't mean apply those rules to ME!'
"Good going there, Baghdad bob. You've actually provided an argument that we here on TD shouldn't just be able to flag your offensive commentary to keep it hidden - we should be able to remove you completely, and get paid doing so."
It's a source of endless amusement that the trolls infesting the site push for rules and/or laws that would, if actually applied, impact and silence them first.
Whether it's claiming that those that are 'rude' or do nothing but insult people should get the boot while doing nothing but that, or copy/pasting content from another source to defend a law that would make that act too risky to allow, their short-sighted hypocrisy truly knows no bounds.
[ link to this | view in thread ]
Re: Re: Re: Re: Show me an idiot-proof realworld analog
"Did you just describe a sociopath's wet dream?"
Well, I was describing a paradigm old Bobmail/Baghdad Bob/Blue has been advocating persistently and at length for years.
This may be a wet dream for sociopaths but given the tone Bobmail used when he was posting on it he was close to staining his trousers while awake. Not sure what that makes him.
[ link to this | view in thread ]
Re: 'I didn't mean apply those rules to ME!'
"It's a source of endless amusement that the trolls infesting the site push for rules and/or laws that would, if actually applied, impact and silence them first."
It is indeed. Although given how long I've seen these sock puppets posting both here and on Torrentfreak, I'm fairly convinced we're really just talking about that one guy who keeps trying to pretend it's not just him, desperately rushing one sock puppet to the defense of the other when people have been too mean to it.
I keep saying that since his stated objective is commercial his first order of business should be to open a Patreon account or at least put out a hat. No clown should have to work unpaid, and his persistence in doing just that for years without a single dime received is a crying shame.
Unless some kind soul every now and then tosses 50 cents into his cubicle. That's always possible.
[ link to this | view in thread ]