It's Time to Talk About Internet Companies' Content Moderation Operations

from the transparency dept

As discussed previously, on February 2nd, Santa Clara University is hosting a gathering of tech platform companies to discuss how they actually handle content moderation questions. Many of the participants have written short essays about the questions that will be discussed at the event -- and over the next few weeks we'll be publishing many of those essays. This first one comes from Professor Eric Goldman, who put the conference together, explaining the rationale behind the event and this series of essays.

Many user-generated content (UGC) services aspire to build scalable businesses where usage and revenues grow without increasing headcount. Even with advances in automated filtering and artificial intelligence, this goal is not realistic. Large UGC databases require substantial human intervention to moderate anti-social and otherwise unwanted content and activities. Despite the often-misguided assumptions by policymakers, problematic content usually does not have flashing neon signs saying "FILTER ME!" Instead, humans must find and remove that content—especially with borderline cases, where machines can't make sufficiently nuanced judgments.

At the largest UGC services, the number of people working on content moderation is eye-popping. By 2018, YouTube will have 10,000 people on its "trust & safety teams." Facebook's "safety and security team" will grow to 20,000 people in 2018.

Who are these people? What exactly do they do? How are they trained? Who sets the policies about what content the service considers acceptable?

We have surprisingly few answers to these questions. Occasionally, companies have discussed these topics in closed-door events, but very little of this information has been made public.

This silence is unfortunate. A UGC service's decision to publish or remove content can have substantial implications for individuals and the community, yet we lack the information to understand how those decisions are made and by whom. Furthermore, the silence has inhibited the development of industry-wide "best practices." UGC services can learn a lot from each other—if they start sharing information publicly.

On Friday, a conference called "Content Moderation and Removal at Scale" will take place at Santa Clara University. (The conference is sold out, but we will post recordings of the proceedings, and we hope to make a live-stream available). Ten UGC services will present "facts and figures" about their content moderation operations, and five panels will discuss cutting-edge content moderation issues. For some services, this conference will be the first time they've publicly revealed details about their content moderation operations. Ideally, the conference will end the industry's norm of silence.

In anticipation of the conference, we assembled ten essays from conference speakers discussing various aspects of content moderation. These essays provide a sample of the conversation we anticipate at the conference. Expect to hear a lot more about content moderation operational issues in the coming months and years.

Eric Goldman is a Professor of Law, and Co-Director of the High Tech Law Institute, at Santa Clara University School of Law. He has researched and taught Internet Law for over 20 years, and he blogs on the topic at the Technology & Marketing Law Blog.



Filed Under: companies, content moderation, filtering, intermediary liability, internet platforms, moderation


Reader Comments



  1. identicon
    Pixelation, 29 Jan 2018 @ 1:36pm

    I will discuss this, but only in moderation.


  2. icon
    Designerfx (profile), 29 Jan 2018 @ 1:41pm

    I received an app to do moderation jobs through lionbridge

    I know nothing about it aside from the fact that they tried to pay people very poorly to basically apply subjective rules to google search. One position was called social media internet assessor (aka moderator), and another was explicitly focused on mobile search. Both were focused on accuracy, not speed - which makes sense given the number of people they must be hiring.

    That being said, it's subjective accuracy, basically relying on human tuning, and they were clearly aiming to pay people a very low income to moderate.


  3. identicon
    Jordan Chandler, 29 Jan 2018 @ 2:09pm

    Re: I received an app to do moderation jobs through lionbridge

    Zuckerberg didn't get rich by being a generous employer


  4. This comment has been flagged by the community.
    identicon
    Anonymous Coward, 29 Jan 2018 @ 2:21pm

    "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE under Section 230, CDA.

    When you get a BASIC fact wrong, the rest can't be right. -- And yeah, goes on as below to include "remove", but at best it's lousy phrasing.

    "A UGC service's decision to publish or remove content can have substantial implications for individuals and the community,"

    Now, The Masnick often sez that "platforms" such as mega-corporations Facebook, Google, and Twitter, have a First Amendment Right to arbitrarily control the speech of "natural" persons, even to stop the "service". The second tier of weenies say persons will just have to find alternatives, but separate in some tiny venue is clearly not equal. So I say that's un-American and that these supra-national corporations need good clear cause under common law to do ANY regulating. -- Sure, for a while they'll pick targets vaguely justified under common law, but soon as everyone is used to speech controlled by corporations, it'll be used against anyone not a rabid globalist.

    "yet we lack the information to understand how those decisions are made and by whom." -- Maybe you lack such, but I can already clearly see Google's and Facebook's, and Twitter's corporate philosophy in operation. Only point in question is how much they hide it. They're getting eager to censor, and are testing it with such as ad bans to Infowars and antiwar.com. -- Just last week Sean Hannity's Twitter account was closed. They now claim was some mistake or glitch, but if one doesn't have millions of followers to object, then there'd be not even a murmur when persons are shut out of "platforms". You can't complain publicly when your means of access is taken away.

    And proving that this is just more of the same nattering is that there's nothing solid, just announcing yet another "conference" where self-aggrandizing weenies (who believe themselves "insiders") will variously gloss over facts and hide the ongoing censorship by globalist corporations of "conservative" and pro-American persons.


  5. This comment has been flagged by the community.
    identicon
    Anonymous Coward, 29 Jan 2018 @ 2:25pm

    Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE under Section 230, CDA.

    Dang "Markdown". I should have split the second italicized para at the double hyphens.

    BTW: I note the interim limit on length of both subject line and body seems to be removed, so, since that's generally more convenient for all, I'm back to prior practice.


  6. identicon
    Anonymous Coward, 29 Jan 2018 @ 2:31pm

    Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE under Section 230, CDA.

    You are wrong, as usual.


  7. This comment has been flagged by the community.
    identicon
    Anonymous Coward, 29 Jan 2018 @ 2:38pm

    Re: Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE under Section 230, CDA.

    > "You are wrong, as usual."

    You're far worse than wrong, you're INEFFECTIVE.

    This is a typical AC one-liner -- probably by a Techdirt "administrator" -- unable to think of more substance than gainsaying.


  8. identicon
    Anonymous Coward, 29 Jan 2018 @ 2:39pm

    Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE under Section 230, CDA.

    When you keep repeating "natural person" and "common law" like a voodoo protection spell it really undermines attempts to make a point.

    Will you next start claiming that a fringe on the flag means all judges are presiding over maritime court and therefore have no jurisdiction? Are you a sovereign citizen who doesn't need a driver's license and can write off your debts with the secret bank account encoded in your birth certificate?


  9. This comment has been flagged by the community.
    identicon
    Anonymous Coward, 29 Jan 2018 @ 2:44pm

    Re: Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE under Section 230, CDA.

    > "When you keep repeating "natural person" and "common law" like a voodoo protection spell it really undermines attempts to make a point."

    Oh, it's likely "Stephen T Stone", then -- and AS IF you want me to make a point!

    Do you even grasp that lawyers invented "corporate" persons out of the blue? They've forced use of the term "natural", not me.

    Those who understand will appreciate the accuracy, while those who oppose the Constitution will try to minimize use of the terms.


  10. icon
    That Anonymous Coward (profile), 29 Jan 2018 @ 2:54pm

    They will defend not sharing their secret sauce with the claim that if we explain it, people will bypass it.
    Ignoring the history: people will figure out the black box, bypass it, and another round of secrets will be stuffed in the box, in an unending spiral of "we can fix this, if only we spend more!!!!!!"

    They face pressure from politicians, & 'well meaning' groups who want to wrap the world in nerf & pretend bad things don't exist if we can hide them. Corporations are now responsible for what children see & must be punished if they fail to protect 'the children'!!

    I didn't get weird because I saw the word nazi online. I've said some horrible cutting things & used bad words; thankfully I've avoided the Moral Moderation League so far. Heh, I've triggered moderation here a couple times, but magically it isn't for my content, it's for how it looked to code... and it's reviewed & approved quickly. They are trying to stop spammers & the trade-off is sometimes if I sound like a bot I get caught too.

    We cannot keep blaming Google, FB, Twitter, etc. because you saw something that offended you. The internet is vast & it's still going to be out there, somewhere. Politicians of a certain age believe Google = Internet. While they are large, they aren't the entire internet.

    No code can replace human review, but humans can only handle so much. Without clear training you end up with silly enforcement. See Also: A verified Twitter account can say things worse than unverified & not get the same timeout.

    We need to admit, there is bad shit out there in the world.
    It is OUR responsibility to prepare ourselves & kids for seeing it rather than just expect someone else once again can be legislated to do the heavy lifting for us.

    Oooh and because filters are fun...
    Has anyone Seen Kyle? He's about this tall! Have you Seen Kyle? (if you aren't laughing, say it out loud or google "have you seen kyle").


  11. identicon
    Anonymous Coward, 29 Jan 2018 @ 3:08pm

    Re: Re: Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE under Section 230, CDA.

    The fact that you used your own name without irony pretty much sums you up blue.


  12. identicon
    Anonymous Coward, 29 Jan 2018 @ 3:09pm

    Re: Re: Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE under Section 230, CDA.

    You didn’t answer the question.


  13. This comment has been flagged by the community.
    identicon
    Anonymous Coward, 29 Jan 2018 @ 3:27pm

    Re: Re: Re: Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE under Section 230, CDA.

    > "You didn’t answer the question."

    So who are you, AC, to assert that a rule here is that all questions must be answered? (Including this one.)

    Yet more off-topic one-liners are all you have? (It's likely the same one AC with the magic of Tor Browser.)

    Next you'll be typing chicken noises.


  14. identicon
    Anonymous Coward, 29 Jan 2018 @ 3:42pm

    Re: Re: Re: Re: Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE under Section 230, CDA.

    I’m the goddamn Batman.


  15. identicon
    Anonymous Coward, 29 Jan 2018 @ 4:14pm

    Re: Re: Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE under Section 230, CDA.

    k, keep yelling at clouds.


  16. identicon
    Anonymous Coward, 29 Jan 2018 @ 4:15pm

    Re: Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE under Section 230, CDA.

    > Will you next start claiming that a fringe on the flag means all judges are presiding over maritime court and therefore have no jurisdiction? Are you a sovereign ciizen who doesn't need a driver's license and can write off your debts with the secret bank account encoded in your birth certificate.

    hey, cool info, ac! want to use this myself, so would you state what's true, and give a really authoritative source?

    or are you just making stuff up?


  17. This comment has been flagged by the community.
    identicon
    Anonymous Coward, 29 Jan 2018 @ 4:31pm

    Re: Re: Re: Re: Re: Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE und

    > "I’m the goddamn Batman."

    No, not THE goddamn Batman! -- I think you're just
    a goddamn idiot. But thanks for showing the heights of Techdirt discourse.


  18. icon
    Stephen T. Stone (profile), 29 Jan 2018 @ 4:36pm

    Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE under Section 230, CDA.

    I am not going to do a reply to each of your individual posts because of course not. Consider this a catch-all for all of your bullshit in this entire conversation thread up to now.

    "platforms" such as mega-corporations Facebook, Google, and Twitter, have a First Amendment Right to arbitrarily control the speech of "natural" persons, even to stop the "service".

    They do. They absolutely do. Google could literally shut down YouTube tomorrow at noon and nobody outside of Google’s management team could stop them.

    (Also: “Natural persons” is not a magic catchphrase, Mr. SovCit. You do not strengthen your argument by using that or other SovCit lingo as if it means something to anyone other than you.)

    The second tier of weenies say persons will just have to find alternatives, but separate in some tiny venue is clearly not equal.

    That is correct. If a service boots you from its platform or shuts down, you will have to find an alternative. You cannot force a platform to either host your speech or remain active.

    I say that's un-American

    Given how the American socioeconomic system has allowed a handful of major corporations to control the production and distribution of nearly all of our media, calling this system “un-American” seems dishonest.

    these supra-national corporations need good clear cause under common law to do ANY regulating.

    Again: SovCit buzzwords that you either cannot or will not explain or define do not bolster your argument. As for the “good clear cause” issue, I would think the right of a company to “regulate” speech on any platform it owns is cause enough.

    for a while they'll pick targets vaguely justified under common law, but soon as everyone is used to speech controlled by corporations, it'll be used against anyone not a rabid globalist.

    You mean like the people who get suspended from Twitter because they dared to snap back at a harasser and got reported en masse by that harasser’s followers? Or the queer people who had their videos blocked within YouTube search simply because they had LGBT-friendly titles/content? Corporations already fuck this up. There is no perfect moderation. That said, we can accept the imperfection of Internet moderation while working to improve it.

    (And for a third time: “Globalist”, like “natural person” and “common law”, is not the magic word you think it is.)

    They're getting eager to censor, and are testing it with such as ad bans to Infowars and antiwar.com.

    Question! Do you believe companies such as Google should be forced to host advertisements from anyone with the cash to afford an ad spot?

    Just last week Sean Hannity's Twitter account was closed. They now claim was some mistake or glitch, but if one doesn't have millions of followers to object, then there'd be not even a murmur when persons are shut out of "platforms". You can't complain publicly when your means of access is taken away.

    Uh, two things: Twitter reported that his account was legitimately hacked, and Sean Hannity has a television show on which he can complain. And yes, you cannot complain publicly on that platform, but there are plenty of other platforms out there to use. If Twitter dumps you, you can use a given Mastodon or GNU Social instance, Tumblr, YouTube, Blogger, Ello, or any other site that allows for publicly-viewable posts to complain about how Twitter silenced all of your speech forever.

    just announcing yet another "conference"

    So what?

    Do you even grasp that lawyers invented "corporate" persons out of the blue?

    The average person calls those things “corporations”. Lawyers, reporters, and SovCits are the only assholes who use the phrase “corporate persons” with sincerity. To which one of the three groups do you belong?


  19. identicon
    Anonymous Coward, 29 Jan 2018 @ 4:52pm

    I hope the discourse over moderation over the coming years uses more neutral language to describe the problems.

    "Racist Russian trolls are weaponizing freedom of speech to censor marginalized persons!"

    vs

    "Interest groups are drowning conversation threads on social media with off-topic or otherwise intentionally inflammatory rhetoric"


    If you want productive discussion, it starts with you. You *know* there's certain demographics that claim partisan social media management are censoring Conservatives. Why use rhetoric that's likely to provoke them rather than presenting the issue neutrally? You'll get less trolls that way.


  20. identicon
    Anonymous Coward, 29 Jan 2018 @ 5:11pm

    Re: Re: Re: Re: Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE under Section 230, CDA.

    Chicken noises are the bread and butter of your anti-Masnick rallying boyfriend, average_joe/antidirt.

    Seriously, out_of_the_blue, let your fuck buddies up for oxygen every once in a while. Oxygen's not copyrighted, I promise there's no infringement.


  21. identicon
    Anonymous Coward, 29 Jan 2018 @ 5:29pm

    Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE under Section 230, CDA.

    If you do not like the way the existing social media platforms are run, you are free to build one that follows your rules, and try to attract enough users for it to be a viable business.

    Demanding that others carry your speech is as much an affront to their freedom as demanding that others censor their own or others' speech, as both demands are one person trying to force others to bend to their will.


  22. icon
    Stephen T. Stone (profile), 29 Jan 2018 @ 5:36pm

    Re:

    You know there's certain demographics that claim partisan social media management are censoring Conservatives. Why use rhetoric that's likely to provoke them rather than presenting the issue neutrally?

    Because those same people will weaponize a rhetorically-neutral response against you.


  23. identicon
    Anonymous Coward, 29 Jan 2018 @ 5:47pm

    I worked for the legal department of a large website hosting provider and handled many, many content moderation issues. There's no training. It's entirely the subjective and arbitrary interpretation of the person that handles the complaint. Sure, you can complain up the management chain, but even then it just gets to the point of "what course of action is least likely to cause public issues and/or legal action?"


  24. identicon
    Anonymous Coward, 29 Jan 2018 @ 6:00pm

    Re: Re:

    Because those same people will weaponize a rhetorically-neutral response against you.

    I am disappointed that you assume all who could possibly disagree are beyond reasonable discourse. The idea behind rhetorical neutrality isn't to eliminate hecklers - it's to reduce their numbers and to clearly present the problem.

    I've been seeing quite a lot of reasonable complaints about absurd social media moderation in recent months. Enough complaints that I am predisposed to think partisan ideologues are using the cover of the "moderation" topic to moderate reasonable, non-inflammatory speech they don't like. I'm not saying this as a partisan ideologue - I don't care for tribal politics and I prefer focusing on issues.

    If you're serious about furthering what amounts to public policy discourse while minimizing hecklers and trolls, please keep inflammatory rhetoric to a minimum.

    If you insist on using inflammatory rhetoric, then you shouldn't be surprised when you get inflammatory responses.


  25. identicon
    Anonymous Coward, 29 Jan 2018 @ 6:10pm

    Re: Re: Re: Re: Re: Re: Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE

    You’re just mad that I have gold fringe on my cape!


  26. This comment has been flagged by the community.
    identicon
    Anonymous Coward, 29 Jan 2018 @ 6:19pm

    The "content moderation" movement attempts to legitimize prior restraint and censorship. Pure and simple. People who write articles like this, as if the movement were valid, attempt to legitimize said movement. TechDirt does so because they have a monetary alignment with those involved. Those involved are simply using the movement as cover for demonetization schemes because they are unable and unwilling to pay out as before. You are not fooling anyone. Content moderation is censorship. You are backing censorship. All your spin is not going to change that. Attacking me for my words (apparently the only thing you know how to do) will not change that.


  27. identicon
    Anonymous Coward, 29 Jan 2018 @ 6:45pm

    Re:

    You made allegations about financial motivations on Techdirt's part.


    Care to substantiate these allegations?


  28. identicon
    Anonymous Coward, 29 Jan 2018 @ 6:46pm

    Re:

    "Both sides" concern trolling, what a great solution.


  29. identicon
    Anonymous Coward, 29 Jan 2018 @ 6:46pm

    Re:

    You don't know what prior restraint is.


  30. identicon
    Anonymous Coward, 29 Jan 2018 @ 6:58pm

    Re: Re:

    Terrible b8, m8. I r8 1/8.

    Respond to something I actually said next time.


  31. identicon
    Anonymous Coward, 29 Jan 2018 @ 7:00pm

    Re:

    Is that you, Hamilton? Started having dreams about this website again?


  32. icon
    Stephen T. Stone (profile), 29 Jan 2018 @ 7:30pm

    Re: Re: Re:

    I am disappointed that you assume all who could possibly disagree are beyond reasonable discourse.

    I assume anyone is capable of reasonable discourse unless they give me reason to think otherwise. Mr. SovCit, for example, does that when he uses SovCit voodoo phrases like “natural persons” and “common law” as if they are legitimate arguments.

    The idea behind rhetorical neutrality isn't to eliminate hecklers - it's to reduce their numbers and to clearly present the problem.

    Such arguments will not always be addressed by people who can and will argue in good faith. Rhetorical neutrality can still be weaponized against those who would use it. After all, which one sounds like a more effective phrasing for mainstream adoption: “family reunification” or “chain migration”? (Hint: The former is the actual language used in immigration laws; the latter is the phrasing adopted by conservatives and right-wingers.)

    I am predisposed to think partisan ideologues are using the cover of the "moderation" topic to moderate reasonable, non-inflammatory speech they don't like.

    Well, yeah. Dogmatic partisan thinking is a hell of a thing.

    If you're serious about furthering what amounts to public policy discourse while minimizing hecklers and trolls, please keep inflammatory rhetoric to a minimum.

    Or we can tell the trolls and hecklers to fuck off.


  33. icon
    Stephen T. Stone (profile), 29 Jan 2018 @ 7:38pm

    Re:

    Content moderation on a given platform is neither generalized nor governmental censorship. You can argue that it is censorship for that specific platform, but that is the best argument you can make in that regard. If Twitter bans you, dozens of other services, both free and paid, can and will host your speech—and the government, at least in the United States, cannot stop you from making use of those services.

    You cannot force a platform to host your speech. I have no idea why you would even want to try.


  34. identicon
    Anonymous Coward, 29 Jan 2018 @ 7:51pm

    Re:

    They aren’t attacking your words, matey. They are attacking the obvious bullshit you are spewing. Stop lying or grow a thicker skin.


  35. identicon
    Anonymous Coward, 29 Jan 2018 @ 8:27pm

    Re: Re: Re: Re:

    I assume anyone is capable of reasonable discourse unless they give me reason to think otherwise. Mr. SovCit, for example, does that when he uses SovCit voodoo phrases like “natural persons” and “common law” as if they are legitimate arguments.

    If you truly think anyone is capable of reasonable discourse, why provoke? I'm not saying the nut you're referring to is reasonable, but that using inflammatory rhetoric poisons the well from the get-go and gives ammo to alt-right ideologues to recruit non-partisan trolls to join in their campaigns.

    It's unnecessary and escalates conflict for no discernable gain.

    Such arguments will not always be addressed by people who can and will argue in good faith. Rhetorical neutrality can still be weaponized against those who would use it. After all, which one sounds like a more effective phrasing for mainstream adoption: “family reunification” or “chain migration”? (Hint: The former is the actual language used in immigration laws; the latter is the phrasing adopted by conservatives and right-wingers.)

    I always found that framing to be of the "terrorists vs freedom fighters" variety. Both are emotional and easily co-opted by people looking to add their pet issue to the bogeyman list (see: "ecoterrorism" or most of the War On Terror). It obscures rational cost/benefit analysis and is mainly an excuse to gain mob support behind an issue that the Big Boys And Girls already decided behind closed doors.

    Well, yeah. Dogmatic partisan thinking is a hell of a thing.

    Yep. Take the recent Twitch fiasco, where a woman whose mother is from Africa was banned, with moderators calling her racist, over an ethnic cooking video of actual food eaten in Africa, not racist caricatures. This woman is sending in her DNA test to Twitch to get this resolved.

    I find someone sending in a DNA test to a social platform to be absolutely absurd, yet it's the dogmatic political environment we're in with tech that led to this.

    Or we can tell the trolls and hecklers to fuck off.

    My disappointment remains. If this is an omen of how the debate will play out in the coming months, I expect it to devolve into little more than name calling.


  36. identicon
    Anonymous Coward, 29 Jan 2018 @ 8:45pm

    Re: Re: Re:

    I am disappointed that you assume all who could possibly disagree are beyond reasonable discourse.

    Possibly not. But from the ones who frequent this site?

    Pick one, from average_joe/antidirt, out_of_the_blue, Richard Bennett, Hamilton/Shiva Ayyadurai, MyNameHere/horse with no name/Just Sayin', darryl, Technopolitical, and Tara Carreon.

    We'll wait. Take as much time as you need.


  37. icon
    Stephen T. Stone (profile), 29 Jan 2018 @ 9:00pm

    Re: Re: Re: Re: Re:

    If you truly think anyone is capable of reasonable discourse, why provoke?

    I provoke only when someone has proven themselves either incapable or unwilling to debate in good faith. If I disagree with someone here, I give them the opportunity to debate me in good faith, which is why I try to hold off on using personal insults unless they demonstrate a lack of good faith in debating me. My experience is my own; your mileage may vary.

    using inflammatory rhetoric poisons the well from the get-go and gives ammo to alt-right ideologues to recruit non-partisan trolls to join in their campaigns

    “Alt-right” ideology runs on inflammatory rhetoric; if anything, the “alt-right” revels in using it. If they can “own the libs” by provoking someone with liberal/progressive/left-wing political beliefs into the political equivalent of a “Yo’ Momma” battle, the “alt-right” will do exactly that.

    It obscures rational cost/benefit analysis and is mainly an excuse to gain mob support behind an issue that the Big Boys And Girls already decided behind closed doors.

    That is, in a sense, my point: Whereas “family reunification” is an accurate phrasing of that immigration policy, “chain migration” is the phrase that seeped into the mainstream because it is “punchier”. The latter phrase obscures the actual policy (reuniting families through a legal immigration process) and creates a more frightening image of immigrants, related or not, coming into the country one after the other in a long “chain” that makes the US less safe.

    This kind of obscurement tends to come from conservatives/right-wingers far more than it does from liberals/left-wingers, by the way. Look at the whole uproar over “death panels” as another example.

    I find someone sending in a DNA test to a social platform to be absolutely absurd

    It is. It absolutely is. And Twitch higher-ups should be ashamed of themselves if they do not immediately overturn her ban.

    My disappointment remains.

    You say that like I’ve never heard it before.

    If this is an omen of how the debate will play out in the coming months, I expect it to devolve into little more than name calling.

    As stated above, I am more than willing to debate in sincere good faith. But when I think someone is trying to take advantage of that, be it by “otherwording” me or by resorting to insults of their own or some other bad-faith debate tactic, I let loose with the venom and bile. I insult Mr. SovCit right off the bat, for example, because they have proven time and time again that they have more interest in trolling this site than in having a good-faith debate on the subject at hand.

    If someone avoids the vitriol, I treat them in kind. If they spit in my face, I treat them in kind. You only get what you give.


  38. icon
    ECA (profile), 30 Jan 2018 @ 2:12am

    It happened in the past..

    They DID it to newspapers..
    THEY did it to TV...
    NOW they try to get the net to do it..
    It's a backdoor world.


  39. icon
    Richard (profile), 30 Jan 2018 @ 3:03am

    Re: Re: Re: Re: Re: Re:

    That is, in a sense, my point: Whereas “family reunification” is an accurate phrasing of that immigration policy, “chain migration” is the phrase that seeped into the mainstream because it is “punchier”.

    Family reunification is the laudable purpose that was in the minds of those who framed the law.

    Chain migration is an unintended consequence that can happen when people take advantage of the law.

    Unfortunately it is very hard to frame a law in such a way as to allow the former whilst preventing the latter.

    We all know of hard cases that occur even under current, relatively relaxed, immigration rules - so the unintended consequences do work both ways.

    The real problem is that it will be impractical to have reasonable immigration rules whilst the countries that source immigration are in such a messed up state.

    In some places this will probably fix itself within a couple of generations (eg most of Eastern Europe).

    In other places it is a matter of better governance and a certain amount of aid/inward investment. (eg Sub Saharan Africa, Latin America).

    In still others there are cultural/religious issues that make genuine progress difficult - several of these countries have actually regressed in recent years (eg Iran, Turkey, Pakistan and most of the Middle East/North Africa).

    In reality the rhetoric of neither the alt-right nor the left on its own will solve things like this. The left have to admit that the conservatives, even the alt-right, are actually correct on some issues, and the right have to admit that the underlying principles of the left are correct - even though they have messed up in some areas.

    Unfortunately both sides are moving apart into their own spaces. The reason that google/facebook etc should not censor the conservative voices is not for the sake of the conservatives - it is for the sake of the left - who need to hear those voices - in order to correct their own mistakes.


  40. icon
    The Wanderer (profile), 30 Jan 2018 @ 6:19am

    Re: Re: Re: Re: Re: Re: Re:

    There was a piece on NPR recently (I think yesterday, in fact) which pointed out that the "family reunification" immigration policy was put in place back when most existing immigrants were from European - read, white - countries, as a "backdoor" way of restricting immigration from countries with darker-skinned populations, and that it was proposed (by a Democrat) as an alternative to a merit-based immigration system. It's just that over the intervening decades, enough people from those other countries got in by other means (e.g., by marrying US citizens) that bringing in their families has resulted in more existing immigrants being from those other countries than from the "white" ones.

    So the racists have gone from opposing merit-based immigration and pushing "family reunification"-based immigration as a way to keep out the darkies, to pushing merit-based immigration and opposing "family reunification" as a way to keep out the darkies.

    The party affiliations of the racists may have shifted somewhat, and the exact policies they push to achieve their goals may be different, but the goals themselves seem more or less unchanged.


  41. icon
    That One Guy (profile), 30 Jan 2018 @ 6:40am

    Re: Re:

    You cannot force a platform to host your speech. I have no idea why you would even want to try.

    Because some people apparently believe that they are owed a platform to speak from, consequence free, and that the fact that a non-government individual/company owns that platform should not be any bar to their 'right' to use it.

    Or put another way, while they want people to respect their free speech rights, even the 'rights' they don't actually have, they apparently don't believe that those running platforms should enjoy free speech rights of their own, and/or be allowed to exercise any control over their platform in ways that might infringe the non-existent 'rights' they are claiming.


  42. identicon
    Anonymous Coward, 30 Jan 2018 @ 6:51am

    Re: Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE under Section 230, CDA.

    The sort of shit you've said here makes me side with the other guy. JS.


  43. icon
    That One Guy (profile), 30 Jan 2018 @ 6:52am

    "The company I don't like should be required to host that!" "What about the one you DO like?"

    Question! Do you believe companies such as Google should be forced to host advertisements from anyone with the cash to afford an ad spot?

    A more apt question might be whether they believe that the sites they listed should be forced to host advertisements from anyone with the cash to pay for the spots, no matter the ideological difference or violations in any TOS they might have.

    I mean it's one thing to say that the big bad mega-corp should be required to host content no matter the source, but when that gets turned around such that people they are siding with might have to host content they don't agree with, that can be quite another matter if they aren't willing to apply the rule equally.


  44. identicon
    Anonymous Coward, 30 Jan 2018 @ 6:58am

    Re: Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE under Section 230, CDA.

    >Lawyers, reporters, and SovCits are the only assholes who use the phrase “corporate persons” with sincerity. To which one of the three groups do you belong?

    Absolutely disgusting display of intellectual dishonesty here. You should be ashamed.


  45. icon
    That One Guy (profile), 30 Jan 2018 @ 7:37am

    Re: Re: Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE under Section 230, CDA.

    Yes, they should absolutely feel ashamed for... what again?


  46. identicon
    Anonymous Coward, 30 Jan 2018 @ 7:44pm

    Re: Re: Re: Re: Re: Re:

    I provoke only when someone has proven themselves either incapable or unwilling to debate in good faith. If I disagree with someone here, I give them the opportunity to debate me in good faith, which is why I try to hold off on using personal insults unless they demonstrate a lack of good faith in debating me. My experience is my own; your mileage may vary.

    I should clarify my stance:

    I hope discourse on the topic of moderation moves away from proactive antagonization.

    “Alt-right” ideology runs on inflammatory rhetoric; if anything, the “alt-right” revels in using it. If they can “own the libs” by provoking someone with liberal/progressive/left-wing political beliefs into the political equivalent of a “Yo’ Momma” battle, the “alt-right” will do exactly that.

    You conflate ideology with culture.

    What I'm saying is that, for the past several years, I've seen alt-right propagandists successfully winning the "hearts and minds" of relatively non-partisan persons by focusing on the political insanity within tech and the associated censorship. Responding to an antagonist with antagonism is a-ok but very often I find antagonism is proactive.

    That is, in a sense, my point: Whereas “family reunification” is an accurate phrasing of that immigration policy, “chain migration” is the phrase that seeped into the mainstream because it is “punchier”. The latter phrase obscures the actual policy (reuniting families through a legal immigration process) and creates a more frightening image of immigrants, related or not, coming into the country one after the other in a long “chain” that makes the US less safe.

    While I'm a-ok with migrants bringing their families, I find the choices of framing you present to be silly. It's like comparing a cute kitten and a man-eating lion.


  47. identicon
    Anonymous Coward, 30 Jan 2018 @ 7:46pm

    Re: Re: Re:

    When a private company takes over the commons, I think it a compelling argument that they should have additional obligations put on them accordingly.


  48. icon
    Stephen T. Stone (profile), 30 Jan 2018 @ 8:54pm

    Re: Re: Re: Re: Re: Re: Re:

    I've seen alt-right propagandists successfully winning the "hearts and minds" of relatively non-partisan persons by focusing on the political insanity within tech and the associated censorship.

    And how do they frame their arguments—in “neutral” terms or in the carefully-chosen phrasing used by “alt-right” ideologues to stoke maximum outrage?

    I find the choices of framing you present to be silly. It's like comparing a cute kitten and a man-eating lion.

    No, my comparison would be like referring to lions as “felines” and someone on the “alt-right” referring to lions as “man-eaters”. The former is intentionally neutral; the latter is used with the intent of scaring people.


  49. icon
    Stephen T. Stone (profile), 30 Jan 2018 @ 9:01pm

    Re: Re: Re: Re:

    When a private company takes over the commons, I think it a compelling argument that they should have additional obligations put on them accordingly.

    I want to test the limits of your stance here. If Twitter’s management team decided to shut down the entire service next week, should any government in the world have the right to force a privately owned/operated Internet service into staying open as if it were a public utility?


  50. identicon
    Anonymous Coward, 30 Jan 2018 @ 9:39pm

    Re: Re: Re: Re: Re: Re: Re: Re:

    And how do they frame their arguments—in “neutral” terms or in the carefully-chosen phrasing used by “alt-right” ideologues to stoke maximum outrage?

    Almost exclusively neutral terms, except when painting right-wing persons as victims or painting the left or left-wing people as unhinged, cartoonishly bigoted bullies.

    They take great pains to avoid mentioning alt-right politics but frequently valorize the ones that have been targeted by activists as victims.

    "[Visual novel author] is being censored for his opinions! The Left pressured his publisher into dropping him!"

    (I go to see what his views are and find his Visual Novel is labelled alt-Hero)

    I admit I find some of their complaints with merit where the antagonists have just been pointlessly cruel to someone who really was minding their own business. But I don't know what has merit vs what is horseshit until I look for myself.


  51. identicon
    Anonymous Coward, 30 Jan 2018 @ 10:38pm

    Re: Re: Re: Re: Re:

    That's irrelevant to my point. I wasn't claiming social media companies are public utilities. Just that someone being banned from the public square that everyone congregates at is a big fucking deal and that saying they're free to go to the empty bar the next town over isn't remotely an equitable alternative.


    That said, I'd think it prudent for there to be plans in place were Facebook to ever fall.


  52. identicon
    Anonymous Coward, 31 Jan 2018 @ 12:57am

    Re: Re: Re: Re:

    What commons have the social media companies taken over? They are like other privately owned spaces where people congregate, like cafes, pubs and clubs, and have the right to control who comes through their doors, and to set standards of behavior inside their premises.


  53. icon
    Stephen T. Stone (profile), 31 Jan 2018 @ 2:23am

    Re: Re: Re: Re: Re: Re:

    I wasn't claiming social media companies are public utilities.

    You referred to “a private company [taking] over the commons”. This phrasing implies a view of services such as Twitter as a public utility, insofar as “the public square” can be considered a public utility. If I am otherwording you, I apologize for the misinterpretation of your comment.

    (That said: You technically did not answer the question I raised.)

    someone being banned from the public square that everyone congregates at is a big fucking deal and that saying they're free to go to the empty bar the next town over isn't remotely an equitable alternative.

    That sucks for someone who gets banned. It still does not explain why Twitter or Facebook should have “extra obligations” just because a service is widely-used and popular. Give me an actual argument for such “obligations”—what they are and why you believe them to be necessary—and I will have more to say on the matter.

    I'd think it prudent for there to be plans in place were Facebook to ever fall.

    The only plans that need to be made in this regard are the plans by Facebook to destroy every single byte of user data after the service shuts down for good, such that no one can ever use or abuse that data in any way. Other than that, I cannot think of any “plans” that must be made by anyone in the event of a Facebook shutdown.


  54. identicon
    Anonymous Coward, 31 Jan 2018 @ 12:41pm

    Re: Re:

    You referred to “a private company [taking] over the commons”. This phrasing implies a view of services such as Twitter as a public utility, insofar as “the public square” can be considered a public utility. If I am otherwording you, I apologize for the misinterpretation of your comment.

    The social commons is whatever everyone congregates at. If no one congregates there, it's no longer the commons.

    The reason it's not a utility is because the social media site itself isn't anything special. The infrastructure isn't special. What's special is access to the people.

    To stretch your utility comparison further, a power company that doesn't provide power isn't an equitable alternative to one that does. That they both have power lines running to your residence and that both have the ability to carry power to your house doesn't matter if, at the end of the day, only one of them is actually able to deliver electricity.

    That sucks for someone who gets banned. It still does not explain why Twitter or Facebook should have “extra obligations” just because a service is widely-used and popular. Give me an actual argument for such “obligations”—what they are and why you believe them to be necessary—and I will have more to say on the matter.

    Do you truly not understand the value of social interaction from a professional and personal standpoint?

    The obligations I speak of are transparency and consistency in how moderation is carried out. Less of this ad-hoc stuff where pressure groups effectively dictate changes to moderation on a double-standard basis.

    The only plans that need to be made in this regard are the plans by Facebook to destroy every single byte of user data after the service shuts down for good, such that no one can ever use or abuse that data in any way. Other than that, I cannot think of any “plans” that must be made by anyone in the event of a Facebook shutdown.

    An enormous number of people rely on Facebook for social and business purposes. If Facebook were to suddenly disappear, there'd be chaos. To many people, Facebook is the Internet. There are many easily foreseeable consequences that would likely result if Facebook were to shut down. It'd be irresponsible for civil society and governments to not have contingency plans.


  55. identicon
    Anonymous Coward, 31 Jan 2018 @ 12:44pm

    Re: Re: Re: Re: Re:

    If there were only 3 reasonably populated pubs in the entire Western world, that comparison would hold.


  56. icon
    Stephen T. Stone (profile), 31 Jan 2018 @ 1:35pm

    Re: Re: Re: Re: Re: Re:

    It already holds. Twitter is a privately-owned service (“space”) where people congregate, and Twitter management has the right to control who gets to use the service (“come through their doors”). They also have the right to set “standards of behavior” for people who are on the service (“inside their premises”). That Twitter is a “club” with a membership of several million people, spambots, Russian disinformation bots, and James Woods does not change those facts.


  57. icon
    Stephen T. Stone (profile), 31 Jan 2018 @ 1:58pm

    Re: Re: Re:

    (Psst. You replied to the wrong reply.)

    The social commons is whatever everyone congregates at. If no one congregates there, it's no longer the commons.

    In reference to the idea of the social commons, you brought up the notion of “the public square”. That phrase conjures the image of an actual public square—that is, a public place where members of the general public can legally congregate and speak their minds. Twitter is not such a space, no matter how many people use the service.

    a power company that doesn't provide power isn't an equitable alternative to one that does. That they both have power lines running to your residence and that both have the ability to carry power to your house doesn't matter if, at the end of the day, only one of them is actually able to deliver electricity.

    This analogy would make more sense if you could show me what “power” is supposed to be in the Twitter-to-power company comparison. And just for the record, any competitor to Twitter is an alternative. Whether it is an “equitable” one really does not matter, since any given Twitter competitor is the same thing as Twitter: a privately owned/operated service.

    Do you truly not understand the value of social interaction from a professional and personal standpoint?

    I do. That still does not explain why Twitter has any responsibility or obligation to function as if it were a true “public square”.

    The obligations I speak of are transparency and consistency in how moderation is carried out.

    If you had said that from the start, you might have saved yourself quite a bit of typing.

    An enormous number of people rely on Facebook for social and business purposes. If Facebook were to suddenly disappear, there'd be chaos. To many people, Facebook is the Internet. There's many easily-forseeable consequences that would likely result if Facebook were to shut down. It'd be irresponsible for civil society and governments to not have contingency plans.

    None of those plans, whatever they may be, should involve forcing Facebook management to keep the service open indefinitely because it is a supposed “lynchpin of society”. Besides, relying on one company or service for a narrowly specific purpose, such as people relying on Facebook to be “the Internet” for them, is a disaster waiting to happen.


  58. identicon
    Anonymous Coward, 31 Jan 2018 @ 2:35pm

    Re: Re: Re: Re:

    In reference to the idea of the social commons, you brought up the notion of “the public square”. That phrase conjures the image of an actual public square—that is, a public place where members of the general public can legally congregate and speak their minds. Twitter is not such a space, no matter how many people use the service.

    The modern day choices are Twitter and Facebook and 4chan.

    Social media has consolidated enormously. The options for a well populated speech platform aren't particularly plentiful these days.

    This analogy would make more sense if you could show me what “power” is supposed to be in the Twitter-to-power company comparison. And just for the record, any competitor to Twitter is an alternative. Whether it is an “equitable” one really does not matter, since any given Twitter competitor is the same thing as Twitter: a privately owned/operated service.

    "Power" in this context is access to other people. The value in social media isn't to speak, but to converse.

    I've made my stance clear that a company which has taken over a large part of the commons should be subject to additional obligations accordingly. I'm not of the mind that social media companies will always behave magnanimously. Why so many people think they will baffles me.

    If you had said that from the start, you might have saved yourself quite a bit of typing.

    That goes both ways. You reflexively dismissed the concept of social media companies having any degree of obligation in regards to their moderation policy because, presumably, their current enforcement is ostensibly focused against repulsive people.

    None of those plans, whatever they may be, should involve forcing Facebook management to keep the service open indefinitely because it is a supposed “lynchpin of society”.

    I said nothing to the effect of forcing a service's management to keep that service open indefinitely. What are you even responding to?

    Besides, relying on one company or service for a narrowly specific purpose, such as people relying on Facebook to be “the Internet” for them, is a disaster waiting to happen.

    It already is a disaster waiting to happen. That's why I'm saying it'd be prudent to have contingency plans.


  59. icon
    Stephen T. Stone (profile), 31 Jan 2018 @ 3:10pm

    Re: Re: Re: Re: Re:

    Social media has consolidated enormously. The options for a well populated speech platform aren't particularly plentiful these days.

    Hence why lots of people are trying to break down the major Internet silos with the Open Web and decentralization efforts. The Mastodon protocol is a good example: Rather than focusing on making a single silo to compete with Twitter, the Masto devs made a protocol that anyone could host and customize to their liking. The federation part of the protocol allows for Masto instances to connect with each other and form a larger network without forcing users to remain on a single instance. For all the guff people might give it, Mastodon is a great idea, and I would love to see its primary idea of decentralized federation become the driving force behind new protocols and services—like, say, a protocol that could compete with Tumblr.
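
    To make the federation idea concrete, here is a rough toy sketch in Python of how posts can flow between independent instances. The class and method names are my own invention for illustration; this is not the actual Mastodon or ActivityPub API, just the general shape of the idea.

        # Toy model of federation: each instance hosts its own users, but
        # posts are delivered across instances, so the network is bigger
        # than any single server. Names are invented for illustration.

        class Instance:
            def __init__(self, domain):
                self.domain = domain
                self.inboxes = {}            # local username -> list of received posts
                self.remote_followers = {}   # local username -> set of (instance, follower)

            def register(self, username):
                self.inboxes[username] = []
                self.remote_followers[username] = set()

            def follow(self, local_user, remote_instance, remote_user):
                # A user on this instance follows a user hosted elsewhere.
                remote_instance.remote_followers[remote_user].add((self, local_user))

            def post(self, username, text):
                status = f"{username}@{self.domain}: {text}"
                # The "federation" step: deliver to followers on other instances.
                for instance, follower in self.remote_followers[username]:
                    instance.inboxes[follower].append(status)

        # Two independent servers form one larger network:
        home = Instance("mastodon.social")
        other = Instance("example.social")
        home.register("alice")
        other.register("bob")
        other.follow("bob", home, "alice")   # bob follows alice across instances
        home.post("alice", "hello, fediverse")
        print(other.inboxes["bob"])          # bob sees alice's post on his own server

    The point of the sketch is only that no single server has to hold everyone; users stay on the instance they chose, and the connections do the rest.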

    I've made my stance clear that a company which has taken over a large part of the commons should be subject to additional obligations accordingly.

    Besides the aforementioned obligation to be transparent and consistent with moderation, what other obligations should be thrust upon services like Twitter?

    I'm not of the mind that social media companies will always behave magnanimously. Why so many people think they will baffles me.

    You will find no disagreement here. Corporations are inherently sociopathic and should be treated as such. Ditto for “brands”.

    You reflexively dismissed the concept of social media companies having any degree of obligation with regard to their moderation policies because, presumably, their current enforcement is ostensibly focused on repulsive people.

    I am not dismissive of the notion. (Twitter would benefit from both transparency and moderation consistency.) My objection comes from the word “obligation” and the context in which it is used here. Twitter is still a privately owned company regardless of its supposed “commons” status. Someone who says it must have further obligations pushed upon it beyond what is prescribed by law tends to skeeve me out.

    I said nothing to the effect of forcing a service's management to keep that service open indefinitely. What are you even responding to?

    My wariness of the word “obligation” in this context, your references to “the public commons”, and your talk of “making plans” in the case of a shutdown of a service like Twitter might have led me to think that you believe the government should have at least some say in whether such a service gets to close. If I went overboard with my thinking, I apologize for my fuck-up.

    link to this | view in thread ]

  60. identicon
    Anonymous Coward, 31 Jan 2018 @ 4:24pm

    Re: Re: Re: Re: Re: Re:

    Hence why lots of people are trying to break down the major Internet silos with the Open Web and decentralization efforts. The Mastodon protocol is a good example: Rather than focusing on making a single silo to compete with Twitter, the Masto devs made a protocol that anyone could host and customize to their liking. The federation part of the protocol allows for Masto instances to connect with each other and form a larger network without forcing users to remain on a single instance. For all the guff people might give it, Mastodon is a great idea, and I would love to see its primary idea of decentralized federation become the driving force behind new protocols and services—like, say, a protocol that could compete with Tumblr.

    Nice in theory, but the implementation left much to be desired.

    The people who run Mastodon have tried using their "biggest instance" influence to shame other instances into adopting their specific moderation and content policies, and in the case of the biggest Japanese instance, they attempted something closer to blackmail.

    I was livid when I saw this in action. It was an attempt to build a censorship-resistant platform and then censor large chunks of it. They said it was to ban Nazis, but like much content moderation these days, it went far beyond that.

    Besides the aforementioned obligation to be transparent and consistent with moderation, what other obligations should be thrust upon services like Twitter?

    I'd say some degree of due process. There are currently double standards with regard to moderation that shouldn't be seen as acceptable.

    You will find no disagreement here. Corporations are inherently sociopathic and should be treated as such. Ditto for “brands”.

    The desire to make money is definitely a strong one for corporations, but it hardly operates in a vacuum from the personal sentiments of a given corporation's leadership. Just my personal view.

    I am not dismissive of the notion. (Twitter would benefit from both transparency and moderation consistency.) My objection comes from the word “obligation” and the context in which it is used here. Twitter is still a privately owned company regardless of its supposed “commons” status. Someone who says it must have further obligations pushed upon it beyond what is prescribed by law tends to skeeve me out.

    Twitter and Facebook took overt actions to replace countless smaller social media outlets. I don't think this should come without a cost beyond the added infrastructure.

    If I went overboard with my thinking, I apologize for my fuck-up.

    No worries. Any governmental planning I was thinking about would be for emergency services whose primary outreach medium right now is Facebook.

    link to this | view in thread ]

  61. icon
    Stephen T. Stone (profile), 31 Jan 2018 @ 5:39pm

    Re: Re: Re: Re: Re: Re: Re:

    It was an attempt to build a censorship-resistant platform and then censor large chunks of it.

    When it comes to that specific instance—on which I have an account, for full disclosure—their rules and code of conduct align with what I want from a social media service. But Mastodon is a protocol, not a service, and it is not limited to mastodon.social alone. The whole point of developing it as a protocol was to avoid siloing the software and its users into one site. Don’t like m.s’s policies? Find an instance that you do like and encourage others to join you. You do not have to give up on Mastodon because of one “bad” instance.

    I'd say some degree of due process. There are currently double standards with regard to moderation that shouldn't be seen as acceptable.

    Therein lies a big issue: How would you design due process for moderation of a site with millions of users, billions of posts, and nowhere near enough actual humans to work behind the scenes on fair and consistent moderation?

    Not to say I disagree with your notion. It is an ideal to work toward. Getting there, however, would be a logistical nightmare.

    Twitter and Facebook took overt actions to replace countless smaller social media outlets. I don't think this should come without a cost beyond the added infrastructure.

    Their potential obligations for being the major players must be weighed against their rights. Punishing the major players just for being the major players reeks of petty vengeance.

    Any governmental planning I was thinking about would be for emergency services whose primary outreach medium right now is Facebook.

    Ah, see, now that is an interesting notion.

    link to this | view in thread ]

  62. identicon
    Anonymous Coward, 1 Feb 2018 @ 12:00pm

    Re: Re: Re: Re: Re: Re: Re: Re:

    > When it comes to that specific instance—on which I have an account, for full disclosure—their rules and code of conduct align with what I want from a social media service. But Mastodon is a protocol, not a service, and it is not limited to mastodon.social alone. The whole point of developing it as a protocol was to avoid siloing the software and its users into one site. Don’t like m.s’s policies? Find an instance that you do like and encourage others to join you. You do not have to give up on Mastodon because of one “bad” instance.

    The social reality is that most people won't move to an instance that has been cut off from the one instance hosting 80%+ of the English-speaking Mastodon population.

    > Therein lies a big issue: How would you design due process for moderation of a site with millions of users, billions of posts, and nowhere near enough actual humans to work behind the scenes on fair and consistent moderation?

    > Not to say I disagree with your notion. It is an ideal to work toward. Getting there, however, would be a logistical nightmare.

    Perhaps start with making their "case law" public? I get that there's a concern about the system being gamed, but the alternative also has noxious externalities.
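
    To give a rough idea of what publishing moderation "case law" could look like, here is a toy Python sketch of a single anonymized case record. The field names are my own invention for illustration; no service publishes records in this exact form as far as I know.

        # Hypothetical structure for one entry in a public, anonymized record
        # of moderation decisions. Field names are invented for illustration.

        from dataclasses import dataclass, field
        from datetime import date

        @dataclass
        class ModerationCase:
            case_id: str
            decided_on: date
            rule_cited: str          # which policy clause was applied
            content_summary: str     # anonymized description, not the content itself
            action_taken: str        # e.g. "removed", "warning", "no action"
            rationale: str           # why the rule did (or did not) apply
            precedent_ids: list = field(default_factory=list)  # earlier cases relied upon

        example = ModerationCase(
            case_id="2018-00042",
            decided_on=date(2018, 1, 31),
            rule_cited="Harassment policy, section 3",
            content_summary="Repeated targeted insults aimed at another user",
            action_taken="removed",
            rationale="Conduct continued after a prior warning for the same behavior",
            precedent_ids=["2017-01177"],
        )

    Publishing something like this for each decision would let users see which rule was applied and why, and whether similar cases were treated the same way, without exposing the people involved.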



    > Their potential obligations for being the major players must be weighed against their rights. Punishing the major players just for being the major players reeks of petty vengeance.

    I care far more about individual liberty than I do about business liberty. That's not to say I want businesses regulated a bunch, but I opt for individual liberty when it and business liberty are at odds.

    link to this | view in thread ]

  63. icon
    Stephen T. Stone (profile), 1 Feb 2018 @ 2:11pm

    Re: Re: Re: Re: Re: Re: Re: Re: Re:

    I have neither the energy nor the intellectual wherewithal to continue this discussion past this point. I do, however, have it in me to thank you for this line of discussion. It has given me several things to think about for future discussion on these subjects, and that is about as good as it gets.

    link to this | view in thread ]

  64. identicon
    Anonymous Coward, 1 Feb 2018 @ 2:43pm

    Re: Re: Re: Re: Re: Re: Re: Re: Re: Re:

    Anytime.

    link to this | view in thread ]

  65. identicon
    Anonymous Coward, 1 Feb 2018 @ 5:51pm

    Re: Re: Re: Re: "UGC service's decision to publish" -- No, "platforms" neither make the decision nor PUBLISH, or they'd be liable: "platforms" are practically IMMUNE under Section 230, CDA.

    This has been Slonecker's shtick for a while. He's moved beyond tapdancing around Masnick while saying "I know better than you and I have evidence to prove it" because he never shows his work. Because there's no work to show.

    Now that he's realized everyone who has a brain is going to call him out on his bullshit he's decided to opt for troll apologetics, soothing the butthurt of MyNameHere and out_of_the_blue. Authoritarian fellatio artists gotta stick together.

    link to this | view in thread ]

  66. identicon
    Anonymous Coward, 13 Feb 2018 @ 6:55pm

    Re: Re:

    TechDirt does not deny the alliance. In fact, they have a big colorful graphic to demonstrate who their masters are.

    link to this | view in thread ]

