On the post: Canadian Privacy Commission Says Clearview's App Is Illegal, Tells It To Pack Its Things And Leave
Just delete any pictures with flannel jackets, hockey helmets, and Tim Hortons cups in the shot; that should be close enough. Also, don't let the door hit you on the way out.
On the post: Without Twitter, Trump Is Left To Write Tweets He Would Have Said On Paper
Maybe he just needs a place to vent his frustrations and was only using Twitter as catharsis, and destroying democracy was just a side effect.
On the post: Can A Community Approach To Disinformation Help Twitter?
Re: Re: Re: Re: Re: Re: Re: Re: Re:
Yeah, I wasn't trying to say the system alone causes it; I'm saying it's not something we want to set up a system to encourage or make even worse than the natural tendencies we start with.
Really, the problem as I see it isn't "what" to filter out, it's how to identify it at scale. Whether you are trying to fact-check stuff or just check that things ideologically match what your platform wants, you face the same problem: how can you actually get it done? Techdirt's system works well at scale, but I think Wikipedia's system also works pretty well at scale.
Kicking off people who violate the rules isn't an issue from my standpoint. You could set up your rules to say people can't lie too much, or that people must support my viewpoint, but either way you have to figure out who is breaking the rules and enforce them at scale.
From my viewpoint, Trump was tolerated for years because he was the president, and Twitter felt like it didn't have the moral authority to just override the electorate when he was using it for presidential business without a better reason than him violating their TOS. The problem there was that you guys elected Trump, more than that Twitter wasn't "controlling" him enough.
I agree that more should be done, and actually I think the result is the same: basically, crack down harder on garbage. I just think the decision on what counts as garbage should be based on what lines up with objective reality rather than a particular philosophy. In this case they are one and the same, since the right wing has abandoned truth and reality completely.
On the post: Can A Community Approach To Disinformation Help Twitter?
Re: Re: Re: Re: Re: Re: Re:
I don't think so; this is basically the system we have right now, where much of it encourages sticking to certain points of view instead of encouraging discussion, understanding, and actually resolving anything.
What you are talking about seems to me like doubling down and having the platform not only encourage certain viewpoints but actively manipulate the discussion in certain directions; it's basically trying to steer the discussion toward whatever your platform wants or decides. You need to assume "platform knows best" in those cases, or even worse "government knows best" if it comes from regulations. Platforms' motives vary wildly; you can't assume how they are going to use that power to manipulate.
I think of it like this: what would the same system look like on Parler, or in China? Could the same system be used to make the fanatics worse, or to pull the wool over everyone's eyes?
On the post: Can A Community Approach To Disinformation Help Twitter?
Re: Re: Re: Re: Re:
I fully admit it's a tough problem, but I don't think the answer is to switch from referee to arbiter.
I disagree that it's dangerous to entertain them as if they could be right.
The biggest trouble I have is that suppressing that stuff systematically just lends it legitimacy. It gives them a legitimate argument when they had none before: now they can claim they do have good arguments, they just aren't allowed to post them. "Trust me, unlike other places, the system here only ever suppresses the bad stuff" just doesn't cut it for me.
I quite like the way Techdirt handles things, but it's still subject to mob justice and encourages the protective wuss-bubble phenomenon. The same system on Parler would flag anyone trying to point out simple contradictions in their conspiracy theories.
I'm leaning more towards combining that with what Twitter was doing: systematically keep track of viral fallacies, flag them as such and get them out of the way, but still give anyone curious a way to find out why, through an opt-in of some sort.
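To make that concrete, here's a minimal sketch of what that kind of opt-in flagging could look like; the `KNOWN_FALLACIES` table, the `Post` class, and the `render` function are hypothetical names for illustration, not anything Twitter or Techdirt actually implements.

```python
# Hypothetical sketch only: track known viral fallacies, collapse posts
# that match one, and let curious readers opt in to see why.
from dataclasses import dataclass
from typing import Optional

# A tracked list of debunked viral claims (keys and links are placeholders).
KNOWN_FALLACIES = {
    "stolen-election": {
        "summary": "Repeatedly debunked; no supporting evidence ever produced.",
        "sources": ["https://example.org/fact-check-1", "https://example.org/fact-check-2"],
    },
}

@dataclass
class Post:
    author: str
    text: str
    matched_fallacy: Optional[str] = None  # set elsewhere by whatever matching process you trust

def render(post: Post, show_reason: bool = False) -> str:
    """Collapse flagged posts behind a notice; expand with the reason only on opt-in."""
    if post.matched_fallacy is None:
        return post.text
    notice = "[flagged as a known viral fallacy - expand to see why]"
    if not show_reason:
        return notice
    info = KNOWN_FALLACIES[post.matched_fallacy]
    sources = ", ".join(info["sources"])
    return f"{notice}\n{post.text}\nWhy flagged: {info['summary']}\nSources: {sources}"

# Usage: the post stays reachable, but the default view gets it out of the way.
p = Post("someone", "The election was stolen!", matched_fallacy="stolen-election")
print(render(p))                    # collapsed notice only
print(render(p, show_reason=True))  # full text plus the reason and sources
```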
On the post: Can A Community Approach To Disinformation Help Twitter?
Re: Re: Re:
They are bullshit because they are unsupportable under scrutiny, and people have to lie or resort to other things like spamming or other fallacies to support them. By definition, their views wouldn't be prejudices if they weren't based in ignorance.
".and never be heard above the noise of anti-science morons proudly proclaiming that they know more than scientists because they watched a YouTube video"
By spamming, linking to a YouTube video with a bunch of untrue, debunked stuff in it, or otherwise doing things that objective rules could handle.
We aren't talking about one opinion vs. another here; we are talking about valid arguments on one side vs. a bunch of argumentative fallacies and uncivil discourse on the other. You don't need to take sides about who is right, you just need to keep your discourse civil. Don't let people spam without making a real argument, and don't let people get away with lying, or with linking to or spouting stuff that is thoroughly debunked just to try to drown out the fact that it's thoroughly debunked.
On the post: Can A Community Approach To Disinformation Help Twitter?
Re:
But that's because the advocacy for extreme positions you are talking about is bullshit. If you were a Newton or Einstein of our time advocating for an extreme position that isn't based on bullshit, you wouldn't be overwhelming social media with bullshit; you would be quietly stating truths and letting society absorb them until your position wasn't extreme anymore.
On the post: Can A Community Approach To Disinformation Help Twitter?
Re:
I don't think expressing extreme views is in and of itself a problem; neither of those examples is disinformation. Saying you think gay people should be executed should be kept in check by decent people everywhere shunning you. Being extreme doesn't make you right or wrong.
The disinformation problem comes when people's views are unsupportable in reality and they start lying to convince people. When person B tries to do anything to back up their position, they lie, because they have nothing to back it up other than "just cuz."
When they pay or forge some pseudo-scientist to put out a study saying that gay people rigged the election and stole it from Trump and start spreading links to it around, or even just make up some bullshit story to support their position, that's what you need to get in front of.
On the post: Can A Community Approach To Disinformation Help Twitter?
Isn't a community approach to disinformation kind of the problem?
On the post: Now It's The Democrats Turn To Destroy The Open Internet: Mark Warner's 230 Reform Bill Is A Dumpster Fire Of Cluelessness
Re: Honeymoon period
You've probably just blocked out the trauma to save your sanity. Everything doesn't have to be perfect for us to be happy that the mountain men from Deliverance aren't in charge anymore.
On the post: Huawei Attempts To Rebuild Trust By Using... Fake Twitter Telecom Experts
Re: Re:
I remember when they did that; there were little blue fairies and unicorns there, it was awesome. Too bad no one took any pictures.
On the post: Utah Theme Park Sues Taylor Swift Over Album Title After Exploiting It
Re: nEvermore
Quoth the raven: $600 a head, please!
On the post: Huawei Attempts To Rebuild Trust By Using... Fake Twitter Telecom Experts
Re:
Everyone can and does plainly see that a laugh track on a sitcom is fake, and it still works. Laugh tracks work for the same reason that fiction works in general: the audience isn't being deceived, they are playing along.
On the post: Microsoft Offers To Break The Web In A Desperate Attempt To Get Somebody To Use Its Widely-Ignored Bing Search Engine
Re: Re: Re:
Exactly, that's why they are in such a sad state right now. The music and movie companies are just on the wrong side; they feel entitled to be paid forever whenever anyone sees or hears stuff they inherited or scammed off creative people back in the '60s.
The news orgs have a legitimate burden to bear, and fewer and fewer resources all the time. They aren't exactly evil (like the RIAA is), they just don't know what to do. Being dependent on Google for what amounts to forced charity is barely a bandaid and isn't going to save them. No one would link to them if they had to pay and were given the choice, and you can't lock down facts; the news will be everywhere quickly anyway. They need to figure out a new, legitimate model to support journalism without relying on being gatekeepers of the result.
On the post: Various States All Pile On To Push Blatantly Unconstitutional Laws That Say Social Media Can't Moderate
Re: Re:
"I don't want elected officials telling what I can read or watch - that's Nazi Germany."
Instead, I want elected officials telling other people that they have to read and watch what I want them to.
On the post: We're Living Our Lives On The Internet, And We Can't Be Free If It Isn't.
Re: Re:
We never had all this censorship bullshit with the TV stations. "I sent them a letter and they decided not to read it on the air... wahh, I've been censored!" I don't know what kicked off this massive sense of entitlement, this idea that "tech companies are suppressing us," but I have a hard time believing it was completely above board. Is it really just that those people are that entitled?
On the post: Microsoft Offers To Break The Web In A Desperate Attempt To Get Somebody To Use Its Widely-Ignored Bing Search Engine
Re:
DuckDuckGo is really impressive.
Google tried to make a Facebook competitor, but it sucked. Facebook doesn't really have an iron grip on anything, though; there's nothing keeping competitors out, and if the DuckDuckGo of Facebooks comes along, providing a lot of the best features without all the baggage, I think Facebook would be in worse shape to deal with it than Google is.
On the post: Huawei Attempts To Rebuild Trust By Using... Fake Twitter Telecom Experts
"US companies do it fairly routinely to generate fake buzz and shape the discourse on social media platforms"
I don't think they generally go the extra mile and create actual faces for their fake people and take pictures of them, though.
In my opinion, this sort of thing should be considered some form of fraud.
On the post: Various States All Pile On To Push Blatantly Unconstitutional Laws That Say Social Media Can't Moderate
Re: Re: Re: Re:
Not at all. Build it however you want. It's just that if bitching that Amazon or some other of your "big tech free speech crushing enemies" won't let you use what they made is your idea of building something yourself, I'm not particularly impressed. Go ahead and bitch about it all you want, but don't expect sympathy from me.
On the post: Various States All Pile On To Push Blatantly Unconstitutional Laws That Say Social Media Can't Moderate
Re: Re: Re: Re: Re: Re: 'Put up or shut up'
Bullshit.