Why Is Wired So Focused On Misrepresenting Section 230?

from the it's-bizarre dept

We've already highlighted our concerns with Wired's big cover story on Section 230 (twice!). The very same day that came out, Wired UK published a piece by Prof. Danielle Citron entitled Fix Section 230 and hold tech companies to account. Citron's proposal was already highlighted in the cover story and now gets this separate venue. For what it's worth, Citron also spent a lot of energy insisting that the Wired cover story was the "definitive" article on 230 despite all of its flaws, and cheered on and liked tweets by people who mocked my arguments for why the article is just not very accurate.

Over the last few years, we've also responded multiple times to Citron's ideas, which are premised on a completely false narrative: that without a legal cudgel, websites have no incentive to keep their sites clean. That's clearly not true at all. If a company doesn't moderate, its site turns into a garbage dump of spam, harassment, and abuse. It loses users. It loses advertisers. It's just not good business. There are plenty of incentives to deal with bad stuff online -- though Citron never seems to recognize any of that, and insists, instead, that every site is a free-for-all because Section 230 shields it from legal liability. This latest piece in Wired is more of the same.

American lawmakers sided with the new inventors, young men (yup, all men) who made assurances that they could be trusted with our safety and privacy. In 1996, US Congress passed Section 230 of the Communications Decency Act, which secured a legal shield for online service providers that under- or over-filtered third-party content (so long as aggressive filtering was done in good faith). It meant that tech companies were immune to lawsuits when they removed, or didn’t remove, something a third party posted on their platforms.

That's only partially accurate. The "good faith" part only applies to one subsection of the bill, and it is not the key reason why sites can be "aggressive" in filtering. That's the 1st Amendment. The immunity part is a procedural benefit that prevents abusive litigation by those seeking to waste the court's time by filing frivolous and expensive lawsuits -- or, more likely, threatening to do so if websites won't remove perfectly legal content they just don't like.

But, thanks to overbroad court rulings, Section 230 ended up creating a law-free zone. The US has the ignominious distinction of being a safe haven for firms hosting illegality.

This is just wrong. And Citron knows it's wrong, and it's getting to be embarrassing that she (and Wired) would repeat it. First, Section 230 has no impact on federal criminal law, so anything that violates federal criminal law is not safe. Second, there are almost no websites that want "illegal" content on their site. Most have teams of people who deal with such reports or court orders. Indeed, the willingness of websites to quickly remove any content deemed illegal has been abused by reputation management firms to get content removed via faked judicial orders, or through a convoluted scheme involving fake defendants who "settle" lawsuits just to get a court order out of it.

This isn’t just an American pathology: Because the dominant social media companies are global, illegality they host impacts people worldwide. Indeed, safety ministers in South Korea and Australia tell me that they can help their citizens only so much, since abuse is often hosted on American platforms. Section 230 is to social media companies what the Cayman Islands has long been to the banking industry.

Over and over again, we've seen the exact opposite of this, in two separate but important ways. First, many of these companies are still more than willing to geoblock content if it's found to violate the law in a certain country. Second, and much more importantly, the ability of US-based websites to keep content up means that threatened, marginalized, and oppressed people are actually able to get their messages out. Oppressive governments around the world, including in places like Turkey and India, have sought to force websites to take down content that merely criticizes those governments.

Any reasonable discussion needs to take that into account before demanding that "illegal" content automatically be taken down. And when weighed against the fact that most companies don't want to host truly illegal and problematic content, most of the content likely to be removed without those protections is exactly the kind of authoritarians-suppressing-speech content that we're concerned about.

Tech companies amplify damaging lies, violent conspiracies and privacy invasions because they generate copious ad revenue from the likes, clicks and shares. For them, the only risk is bad PR, which can be swiftly dispatched with removals, bans and apologies.

This is stated without anything backing it up, and it's garbage. It's just not true. All of the big companies have policies in place against this content, and they (unlike Citron) recognize that it doesn't "generate copious ad revenue from the likes, clicks and shares" (likes, clicks and shares don't directly generate ad revenue...). These companies know that the long-term health of their platforms actually matters, and that losing advertisers and users because of garbage is a problem. This is why Facebook, Twitter, and YouTube all have teams working on these issues and trying to keep their platforms in better shape. They're certainly not perfect at it, but part of that is because of the insane scale of these platforms and the ever-changing nature of the problematic content on them.

I know that among a certain set it's taken on complete faith that no one at these companies cares, because they just "want clicks" and "clicks mean money." But that shows an astounding disconnect from what the people at these companies, and those setting and enforcing these policies actually think. It's just ivory tower nonsense, completely disconnected from reality.

For individuals and society, the costs are steep. Lies about mask wearing during the Covid-19 pandemic led to a public health disaster and death.

Which spread via cable news more than on social media, and included statements from the President of the United States of America. That's not a Section 230 problem. It's also not something that changing Section 230 fixes. Most of those lies are still Constitutionally protected. Citron's problem seems to be with the 1st Amendment, not Section 230. And changing Section 230 doesn't change the 1st Amendment.

Plans hatched on social media led to an assault on the US Capitol. Online abuse, which disproportionately targets women and minorities, silences victims and upends careers and lives.

These are both true, but it's an incredible stretch to say that Section 230 was to blame for either of these things. The largest platforms -- again, Facebook, YouTube, Twitter, etc. -- all have policies against this stuff. Did they do a bad job enforcing them? Perhaps! And we can talk about why that was, but I can assure you it's not because "230 lets us ignore this stuff." It's because it's not possible to magically make the internet perfect.

Social media companies generally have speech policies, but content moderation is often a shell game. Companies don’t explain in detail what their content policies mean, and accountability for their decisions isn’t really a thing. Safety and privacy aren’t profitable: taking down content and removing individuals deprives them of monetizable eyes and ears (and their data). Yes, that federal law gave us social media, but it came with a heavy price.

This is the only point at which Citron even comes close to acknowledging that the companies actually do make an effort to deal with this stuff, but then she immediately undermines it by pretending they don't really care about it. Which is just wrong. At best, it could be argued that the platforms didn't care enough about it in 2010. But that was a century ago in internet years, and it's just wrong now. And "taking down content and removing individuals deprives them of monetizable eyes and ears (and their data)" only if those particular eyes and ears aren't scaring many more off the platform. Every platform now recognizes that trolls and problem makers do exactly that. Citron, incorrectly again, completely misses that these companies now recognize that not all users are equal, and that trolls and bad actors do more damage to the platform than they're worth in "data" and "ad revenue."

It feels like Citron's analysis is stuck in the 2010 internet. Things have changed. And part of the reason they've changed is that Section 230 has allowed companies to freely experiment with a variety of remedies and solutions to best deal with these problems.

Are there some websites that focus on and cater to the worst of the worst? There sure are. And if she wanted to focus in on just those, that would be an interesting discussion. Instead, she points to the big guys, who are not acting the way she insists they do, to demand they do... what they already do, and insists we need to change the law to make that happen, while ignoring all of the actual consequences of such a legal change.

The time for having stars in our eyes about online connectivity is long over. Tech companies no longer need a subsidy to ensure future technological progress.

It's not a subsidy to properly apply legal liability to the actual problematic parties. It's a way of saving the judicial system from a ton of frivolous lawsuits, and of preventing censorship by proxy, in which aggrieved individuals silence critics through mere threats of litigation against third-party platforms.

If anything, that subsidy has impaired technological developments that are good for companies and society.

Uh, no. 230's flexibility has allowed a wide range of different platforms to try a variety of different approaches, and to seek out the ones that work best for each kind of community. Wikipedia's approach is different from Facebook's, which is different from Reddit's, which is different from Ravelry's, which is different from Github's. That's because 230 allows for these different approaches. And all of those companies are trying to come up with solutions that are "good for society," because if they don't, their sites turn into garbage dumps and people will seek out alternatives.

We should keep Section 230 – it provides an incentive for companies to engage in monitoring – but condition it on reasonable content moderation practices that address illegality causing harm. Companies would design their services and practices knowing that they might have to defend against lawsuits unless they could show that they earned the federal legal shield.

The issue with this is that if you have to first prove "reasonableness," you end up with a bunch of problems, especially for smaller sites. First, you massively increase the costs of getting sued (and, as such, you vastly increase the ability of threats to have their intended effect of taking down content that is perfectly legal). Second, in order to prove "reasonableness," many, many, many lawyers are going to say "just do what the biggest companies do," because that will have been shown in court to be reasonable. So, instead of getting more "technological developments that are good for companies and society," you get homogenization. You lose out on the innovation. You lose out on the experimentation with better models, because any new model is just a model that hasn't been tested in court yet and leaves you open to liability.

For the worst of the worst actors (such as sites devoted to nonconsensual porn or illegal gun sales), escaping liability would be tough. It’s hard to show that you have engaged in reasonable content moderation practices if hosting illegality is your business model.

This is... already true? Various nonconsensual porn sites have been taken down by both civil lawsuits and criminal prosecutions over the years. Companies entirely engaged in illegal practices also face federal criminal prosecution, which Section 230 does not protect against. On top of that, courts themselves have increasingly interpreted 230 to not shield those worst-of-the-worst actors.

Over time, courts would rule on cases to show what reasonableness means, just as courts do in other areas of the law, from tort and data security to criminal procedure.

Right. And then anyone with a better idea on how to build a better community online would never dare to risk the liability that came with having to first prove it "reasonable" in court.

In the near future, we would see social media companies adopt speech policies and practices that sideline, deemphasize or remove illegality rather than optimise to spread it.

Again, no mainstream site wants "illegality" on their site. This entire article is premised on a lie, backed up with misdirection and a historical myth.

There wouldn’t be thousands of sites devoted to nonconsensual porn, deepfake sex videos and illegal gun sales. That world would be far safer and freer for women and minorities.

Except there's literally no evidence to support this argument. We know what happened in the copyright space, which doesn't have 230-like protections and does require "reasonable" policies for dealing with infringement. Infringement didn't go away. It remained. As for "women and minorities," it's hard to see how they're better protected in such a world. The entire #MeToo movement came about because people could tell their stories on social media. Under Citron's own proposal, websites would face massive threats of liability should a bunch of people start posting #MeToo-type stories. We've already seen astounding efforts by the jackasses who were exposed during #MeToo to silence their accusers. Citron's proposal would hand them another massive weapon.

The bigger issue here is that Citron refuses to recognize how (and how frequently) those in power abuse tools of content suppression to silence voices they don't want to hear. She's not wrong that there's a problem with a few narrow areas of content. And if she just focused on how to deal with those sites, her argument would be a lot more worth engaging with. Instead, she's mixing up different ideas, supported by a fantasy version of what she seems to think Facebook does, and then insisting that if they just moderated the way she wanted them to, it would all be unicorns and rainbows. That's not how it works.



Filed Under: content moderation, content suppression, danielle citron, section 230, tradeoffs


Reader Comments



  1. Anonymous Coward, 13 May 2021 @ 11:17am

    Companies would design their services and practices knowing that they might have to defend against lawsuits unless they could show that they earned the federal legal shield.

    Opening up the possibility of lawsuits over content moderation decisions opens up a catch-22 situation, where the same content will result in a lawsuit whether it's taken down or left up, as people holding either view will claim their preferred option is the reasonable moderation decision.

    Further, pushing liability for user content onto platforms pushes them towards becoming publishers, in the sense that they look at some submissions and decide which to publish, while the rest are sent to /dev/null because they do not have the resources to look at them all in a timely fashion.

  2. Stephen T. Stone (profile), 13 May 2021 @ 11:18am

    The bigger issue here is that Citron refuses to recognize how (and how frequently) those in power abuse tools of content suppression to silence voices they don't want to hear.

    And sometimes not on purpose. When adult-oriented content (e.g., porn) gets driven from platforms, LGBTQ people are inevitably hit first and hardest by such bans because of preëxisting biases about LGBTQ content — namely, that such content is inherently sexual/adult-oriented.

  3. Anonymous Coward, 13 May 2021 @ 11:26am

    Apply the old adage...

    When a public act makes no sense, follow the money...

  4. Bloof (profile), 13 May 2021 @ 11:45am

    Why would a publication owned by one of the largest privately owned old media organisations in the world support misinformation that will help make it easier for the right to take a machete to laws that protect the internet as we know it? It's not like the internet becoming less usable and a far easier place to spread right wing misinformation would benefit them in any way...

  5. Anonymous Coward, 13 May 2021 @ 11:46am

    the question, surely, should be, who is gaining what at wired out of this misrepresentation? it sure as hell aint being done for nothing!

  6. Thad (profile), 13 May 2021 @ 12:18pm

    Re:

    To be fair, ZD also owns Ars Technica, whose 230 coverage has been quite good. If it's pressure from the corporate parent, it doesn't seem like it's been applied equally to all its subsidiaries.

  7. PaulT (profile), 13 May 2021 @ 12:28pm

    Re: Apply the old adage...

    Not always true. Religion can also have an impact, and certain political factions have been turning into religious cults recently...

  8. Toom1275 (profile), 13 May 2021 @ 12:35pm

    How do you debate Section 230 when Citron keeps lying about it?

  9. That Anonymous Coward (profile), 13 May 2021 @ 12:42pm

    "Section 230 is to social media companies what the Cayman Islands has long been to the banking industry."

    And the attempts to regulate the banking industry have also fallen prey to the lies coming from 1 side about what the laws should do and can do.

    You fix the banking industry & then maybe we'll let you tell us how 230 is the worst thing ever compared to the .1% of the world controlling more money than many nations.

  10. Anonymous Coward, 13 May 2021 @ 12:53pm

    She really needs to define "illegality" better than two examples of gun sales and nonconsensual porn. Those just are not flooding the internet and never have. They are a problem, but little to do with big internet or 230, and do not exist at the scale she suggests. Further, gun sales are very non-regulated in some ways in the States, and online markets (uh, big internet companies?) for the most part fit inside those rules. If they "escape prosecution" outside the internet, what does them having an internet presence change?

    So what the hell "illegality" is she really talking about?

  11. Anonymous Coward, 13 May 2021 @ 1:25pm

    But, thanks to overbroad court rulings

    Translation: rulings I don't like.

  12. Anonymous Coward, 13 May 2021 @ 1:26pm

    Re:

    So what the hell "illegality" is she really talking about?

    This. Surely if there are "thousands of sites devoted to nonconsensual porn, deepfake sex videos and illegal gun sales," there must be examples of court cases where 230 shielded them from liability for their own actions, and not just the actions of their users?

    [citation needed]. I'll wait.

  13. Anonymous Coward, 13 May 2021 @ 1:32pm (flagged by the community)

    You should talk about how Twitter is trying to help kill CDA 230 by claiming it makes them immune from charges for refusing to take down child porn. After all this discussion on 230 no way twitter isnt acting stupid on purpose to give congress ammo.

  15. That One Guy (profile), 13 May 2021 @ 1:34pm

    Fractal wrongness

    I see Wired is really working overtime to ensure that only the foolish would ever trust them on anything tech related going forward, because if they're going to give a platform to people this wrong/dishonest about 230 I can only wonder what other axes they have to grind that might impact what they cover.

    As for the article itself, bloody hell is that a cavalcade of strawmen and open lies. As noted in the TD response, the number of sites that are going to knowingly host 'illegal content' is going to be staggeringly low and most certainly not include the major ones, with the only exception that springs to mind being if you involve the laws in other countries that make illegal stuff like blasphemy and/or speaking ill of those in charge, and from someone who breaks out the 'think of the oppressed' emotional plea you'd think they'd be all for platforms not jumping to squash that sort of speech.

    The lines about covid and the failed insurrection are also particularly crap, as if she wants to point fingers and assign blame for those then social media might be somewhere on the list but much higher would be the gorram president at the time, politicians and 'news' shows that dismissed and stoked the fires respectively, and it takes a special kind of blindness and/or dishonesty to ignore those three groups and instead lay the blame at the feet of social media for not 'doing enough'.

    The idea that 230 protections are some sort of 'government subsidy' was rubbish when it was first presented and still is, as equality under the law is not a privilege. You don't get to sue a newspaper because someone scribbled defamatory content in the margins, you don't get to sue Walmart because someone used one of their knives to stab someone, and you don't get to sue an online platform because you don't like that someone used it to say something you don't like and the platform didn't moderate their property how you wanted them to. In all three cases if you want to sue you go after the responsible party, not the ones who provide the platform/item they used.

    One day there may be an honest criticism of 230 and argument against it, but at this rate I'm not holding my breath, because to date all its critics seem to have is lies, misrepresentations and strawmen.

  16. That One Guy (profile), 13 May 2021 @ 1:38pm

    Re:

    Yeah, I'm gonna call [Citation Needed]/shenanigans on that one, not only does 230 not cover federal crimes(one guess what category CSAM falls into) but the idea that basically any legal platform is going to knowingly host CSAM is almost literally unbelievable, as if anything they tend to lean heavily the other direction just to be safe.

  17. Anonymous Coward, 13 May 2021 @ 2:01pm

    “Help a male colleague thinks I’m scary at work “

    “It’s true everyone’s multitasking at work”

    Wired is a graveyard run by a someone a click princess confirmed.

  18. sumgai (profile), 13 May 2021 @ 2:06pm

    Re: Re:

    To be fair and correct, Ziff-Davis has nothing to do with WIRED, which is owned by Conde Nast. ZD started in 1927, and has annual revenues of half a billion bucks. CN started in 1909, and has annual revenues of 1 & 1/2 billion dollars. ZD employee count is about 1,000, CN runs to 6,000.

    ZD owns PCMag.com (which sprang from the print edition called PC Magazine), and that print edition was born in 1982. Wired as a magazine didn't start until 1993.

    More fun facts can be found at an internet near you! ;)

  19. sumgai (profile), 13 May 2021 @ 2:10pm

    Re:

    Why?

    Because they're on the losing side in a war of attrition. They need for the internet to go away so they can regain their "king of the mountain" position that they had before electricity was discovered by none other than Tim Berners-Lee.

    (Yes, that was a bit of sarcasm.)

  20. sumgai (profile), 13 May 2021 @ 2:14pm

    Re: Re:

    [citation needed]. I'll wait.

    Might take a tad longer than you were thinking, so better bring along your Towel!

  21. christenson, 13 May 2021 @ 2:26pm

    Re: Actual Malice

    "Twitter refusing to take down child porn"??

    This is seriously doubtful, sounds like something you know is false, which could be actual malice on your part, giving rise to liability for defamation.

    So, if you want to be believed, let's have a citation. I suspect we may find the real issue is that twitter imposes some friction in the process to prevent bots from overwhelming them with takedowns, and someone did not understand or overcome that friction.

    Another possibility is that twitter was directed to keep the tweet up by law enforcement, and bad things happen (like jail) to those that follow a link within. See the Playpen case.

  22. Anonymous Coward, 13 May 2021 @ 2:31pm

    the only reason i can think of why wired is against section 230 is it's owned by a corporation that would like to reduce competition from startups and sites like reddit, digg, and other online media companies. removing section 230 protections would mean only the largest online media websites could afford to host user content, forums, or public comments, since all websites could be targeted by trolls using legal action to bury them under large legal expenses to go to court to justify every moderation action, or even just to stop spam appearing on every forum and comment thread.

  23. Anonymous Coward, 13 May 2021 @ 2:54pm

    Re: Re:

    "Think of the children" has become the go-to excuse for ramming through bad legislation, even if someone has to lie to bring it into play.

  24. Anonymous Coward, 13 May 2021 @ 5:02pm

    Re:

    Yeah, they're salty that just buying up existing businesses doesn't guarantee market share, since there are no frequency allocation limits online like there are for TV channels.

  25. Darkness Of Course (profile), 13 May 2021 @ 5:48pm

    What's wrong with Wired?

    Well, nearly everything. Their cheap subscription rate is offset by sending out the magazine as well. I don't have a need for the printed magazine. And I do not want it.

    But, I ditched them years ago, after < 1y subscribed. Too many blunders, unchecked facts, and supporting people with an axe to grind.

  26. John Roddy (profile), 13 May 2021 @ 6:23pm

    Hi! I see you've discovered that lawsuit filed by Moronity in Media. Perhaps you should try reading it before anything else, because it doesn't actually accuse them of any of that. Nor does Twitter's motion to dismiss make those claims.

    Please stop stanning for the censorship brigade.

  27. Anonymous Coward, 13 May 2021 @ 8:20pm (flagged by the community)

    Some sites that exist solely to destroy people's reputations have managed to survive, and some even get regular financial support.

    Citron is right: the law is the best way to prevent those "isolated" instances where free-market capitalism fails.

  28. Anonymous Coward, 13 May 2021 @ 8:24pm

    Re: Re: Re:

    Conde Nast makes money off of the enormous mailing list it compiles through publication of niche magazines (Street & Smith is a good example) that break even on publishing but make a fortune on the subscriber list.

    I doubt they care one way or another about Section 230.

  29. anon, 13 May 2021 @ 8:52pm

    Re: What's wrong with Wired?

  30. PaulT (profile), 13 May 2021 @ 9:35pm

    Re:

    Come on, you can't just drop that kind of claim and not provide a citation... Is the reason for that because the facts don't match the claim you just made?

  31. Mike Masnick (profile), 13 May 2021 @ 10:16pm

    Re: Re:

    They're talking about this ridiculous lawsuit from an advocacy group whose stated mission is to make all pornography illegal:

    https://www.courtlistener.com/docket/31302014/doe-v-twitter-inc/

    Twitter just recently filed its motion to dismiss explaining what a garbage lawsuit this is, and (notably) tells a very different story than the idiot commenter presented above:

    https://storage.courtlistener.com/recap/gov.uscourts.cand.372231/gov.uscourts.cand.372231.48.0.pdf

  32. Mike Masnick (profile), 13 May 2021 @ 10:18pm

    Re:

    Write a law that deals with those sites, but does not hamstring the vast majority of websites and lawful speech out there, and let's talk. So far, she has not done so.

    I'm totally open to an approach that focuses solely on those kinds of sites and does so in a Constitutional manner.

  33. Anonymous Coward, 13 May 2021 @ 11:02pm

    Re:

    Some sites that exist solely to destroy people's reputations have managed to survive

    How's that mailing list lawsuit coming along, John Smith?

  34. PaulT (profile), 13 May 2021 @ 11:07pm

    Re: Re: Re:

    "and (notably) tells a very different story than the idiot commenter presented above"

    Yes, that much was obvious, which is why I asked for a citation earlier. I hadn't come across the story yet, and wasn't going to waste my time looking, so thanks for the details!

    Yet again, the standard holds true - if someone's making a claim that sounds suspicious, and they do so without providing any backup details, they're probably misrepresenting the story.

  35. Nepz (profile), 13 May 2021 @ 11:13pm

    Ulterior motive

    It would seem that many critics of this piece of legislation only really want to modify it to hold platforms accountable for speech that they (the critics) disagree with.
    I definitely think Trump is a threat to American democracy, but removing liability protections for online platforms is not the right way to fight misinformation. All that ends up doing is placing an undue burden on platforms to moderate their content. Furthermore, it's not actually illegal to spread misinformation*. I have a first amendment right to go out on the street and say that I think 5G does cause COVID-19.

    *libel and defamation notwithstanding

  36. That One Guy (profile), 13 May 2021 @ 11:30pm

    Re: Re: Re:

    Well I certainly can't imagine any bias or concerns about honest presentation of the facts from a group who has stated that their goal is to get rid of all porn, and thus might have a vested interest in trying to gut a law that allows sites leeway in what they do and do not allow... That said I am shocked, shocked I say that someone attempting to argue against 230 would do so by misrepresenting the facts of a case, why that hasn't happened in at least the last 30 seconds or so!

    Sarcasm aside, thanks for the links. Reading through Twitter's response, I'm not impressed by the accusations flung their way either, as it seems to be a whole lot of accusations of guilt with only assumptions and assertions as evidence, to the point that I'm left wondering if this is a PR stunt (likely, given who you said is filing it) or a Steve Dallas lawsuit, though whatever it is, it's bad.

  37. Anonymous Coward, 13 May 2021 @ 11:31pm

    Re: Re: Re: Re:

    Conde Nast runs Ars Technica, which... cares very much about Section 230. But you'd know this, John Smith, from their coverage of John Steele's downfall.

  38. cattress (profile), 14 May 2021 @ 1:05am

    Re: Re: What's wrong with Wired?

    I'm not trying to victim blame, as these are children with zero responsibility for what they have suffered. At the same time, I can't help but wonder why their parents or guardians didn't make them feel safe in bringing this stuff to their attention. I'm not saying the parents are at fault either, blame is fully on the predators. But there is a breakdown somewhere in those victims' necessary point of trust, where even if threatened by the pedophile, they were not certain that mom or dad's physical presence would have been overwhelmingly safe and secure and able to protect them from the threats. I guess I can see how threats coming from a trusted adult, in person, can be an effective intimidation tactic to silence a child; whereas an internet stranger, one who has presented themselves as a child or teen, even with some of the child's personal info, like their address, the name of their pet, etc., is more frightening than the security of a physically present parent can overcome. I know these scumbags groom the kids, get details about their lives, and probably target kids they suspect are vulnerable. I was a people-pleaser kid, I got a stomach ache from just being asked to be quiet by an adult, but even if I screwed up, I could still go to my Mommom or Mom and they would fix (or help me fix) the problem. It just bothers me that so many kids didn't feel confident or secure in the "powers" of Mom and Dad (or a grandparent, or a sibling, someone who acted as a protector).
    Maybe I misunderstand the dynamic, but I hope I equip my daughter with the tools to stand her ground until she can get me or her Dad, who she will trust to protect her, regardless of any mistakes or blame she thinks she deserves.

  39. Toom1275 (profile), 14 May 2021 @ 2:13am

    Re:

    [Asserts facts not in evidence]

  40. techflaws (profile), 14 May 2021 @ 4:12am

    Re:

    immune from charges for refusing to take down child porn

    So, how would Twitter go about taking stuff down from other sites? Or can you point to ANY instance of Twitter hosting it?

    Yeah, thought so, genius.

  41. Anonymous Coward, 14 May 2021 @ 6:32am

    Re:

    Simple. You ignore her and talk to people who aren't arguing in bad faith.

    ...oh wait.

  42. Anonymous Coward, 14 May 2021 @ 7:55am

    Re: Re: Actual Malice

    People sell nudes via Twitter (and many other social media sites that have file sharing), ship them via twitter, and can take bitcoin for payment, without anyone asking for 2257 compliance. It's not unreasonable to assume that some minors have exploited this.

  43. Anonymous Coward, 14 May 2021 @ 7:59am (flagged by the community)

    Re: Re:

    Repealing Section 230 is not unconstitutional. Other countries don't even have it (they have notice-and-takedown and somehow still exist).

    It would be up to you to suggest the alternative. Requiring affirmative proof of third-party authorship for a post to stay up would be a good start. No takedown, just require a way to find the author.

    Also even if only a "few" companies could afford the LIEability, they'd become a backbone for many millionaires. PewDiePie never started a tech company and he did just fine.

    If your ideas protect a site like 4Chan you might want to modify them.

  44. Anonymous Coward, 14 May 2021 @ 8:02am

    So basically tech and phone companies can't regulate themselves but free-market forces can wipe out defamation.

    Platforms don't destroy people, people do.

  45. Anonymous Coward, 14 May 2021 @ 8:04am (flagged by the community)

    Re: Fractal wrongness

    So criticizing 230 because individuals are defenseless against reputation blackmail is not honest?

    Looks like 230 can do no wrong in some eyes.

  46. DoodMonkey, 14 May 2021 @ 9:15am

    The answer to the question is

    follow the money.

  47. Mike Masnick (profile), 14 May 2021 @ 10:30am

    Re: Re: Re:

    Repealing Section 230 is not unconstitutional.

    No, it's not. But my specific request was to craft a law that actually focused on the small sliver of bad actors you described, and didn't similarly burden tons of important speech. Repealing Section 230 would not do that.

    Other countries don't even have it (they have notice-and-takedown and somehow still exist).

    Yes, and multiple studies of how things work in those countries show that lots of important speech gets suppressed because of that.

    It would be up to you to suggest the alternative.

    Wait. You're the one demanding change. You suggest the alternative. I'm pointing out the problems with every alternative you propose. I think that 230 as currently written mostly gets the balance correct.

    Requiring affirmative proof of third-party authorship for a post to stay up would be a good start.

    Anonymity is a key value that the 1st Amendment supports. Requiring this would violate the 1st Amendment, so we're back into unconstitutional land.

    Also even if only a "few" companies could afford the LIEability, they'd become a backbone for many millionaires. PewDiePie never started a tech company and he did just fine.

    I honestly have no clue what you're trying to say here.

    If your ideas protect a site like 4Chan you might want to modify them.

    Again, what?

  48. Anonymous Coward, 14 May 2021 @ 11:09am

    Just something I wondered about the #MeToo movement since it was mentioned. Does it at all recognize the existence of false accusations? And if so, what does it do about it? It seems like anyone making an accusation is automatically believed, which makes it easier for a woman with an ax to grind to make life miserable for a guy she doesn't like. What does the movement do to prevent this? What does it do to weed out those who are only after money, revenge, etc. from those who've actually been harmed?

  49. Thad (profile), 14 May 2021 @ 11:10am

    Re: Re: Re:

    Right you are. I meant Conde Nast, not Ziff Davis; thanks for the correction.

  50. Toom1275 (profile), 14 May 2021 @ 12:32pm

    Re: Re: Re:

    Repealing Section 230 is not unconstitutional.

    [Asserts facts contradicted by the Constitution]

  51. Anonymous Coward, 14 May 2021 @ 1:06pm

    Re: Re:

    Personally, I favor at that point breaking out harsh ridicule and mockery to destroy the bad actor's ability to be taken seriously. Shame their law school alma mater for letting a blatant incompetent who cannot understand Section 230 graduate, etc.

  52. Uriel-238 (profile), 14 May 2021 @ 4:15pm

    I'm assuming it's the same as NYT and WSJ

    Wired belongs to a newsmedia agency whose voice(s) get quieter so long as ordinary members of the public can tweet.

    Also people on internet forums can share fact-checks and evidence that run contrary to news-agency-highlighted opinions. Also opposing opinions that might be more convincing.

    So even before we follow the money, they are motivated to convince the legislature to do stupid shit that silences the public-access podiums.

  53. Anonymous Coward, 14 May 2021 @ 6:19pm

    Re:

    No, duh? Thanks for finally coming around, John Smith, but considering you claimed that copyright lengths were long to prevent content creators from getting murdered, safe to say everyone knows you're still the same bitter, impotent fuckwit who thinks all women are scum for refusing to sleep with you.

  54. Anonymous Coward, 14 May 2021 @ 6:25pm

    Re: Re: Re: Actual Malice

    You have proof of this, John Smith? Like for that press release you say is coming?

  55. Anonymous Coward, 14 May 2021 @ 6:27pm

    Re: Re: Fractal wrongness

    The problem is that you're not honest, John Smith.

    Individuals having "no defense" against that legal gambit you keep promoting as a genius strategy is another separate issue, but if that were to happen, based on your own history of claims, you'd be first in line to be subpoenaed for making that tactic available by talking about it constantly.

  56. Anonymous Coward, 14 May 2021 @ 6:32pm

    Re: Re: Re:

    We're sorry that your boy Trump failed to kill Section 230 despite you going down on your knees and begging him to do it. Guess you shouldn't have backed an insurrection-sparking lunatic, huh?

    Let me play your hurt feeling-weelings a little sad song on the world's smallest violin. Neee-neee nee nee nee nee nee nee neeeeeeeh...

  57. Anonymous Coward, 14 May 2021 @ 8:32pm

    Re: Progress! used to be "women and children"

    LGBTQ people are inevitably hit first and hardest

  58. Anonymous Coward, 14 May 2021 @ 8:34pm (flagged by the community)

    Re: "preexisting" (with umlauts that don't reproduce) ??????

    Where the HELL did you get that? NO American, not even the most academic, has spelled it with umlauts for a hundred years! -- It's not uncommon, not rare, not even a possible affectation: NO American uses accented characters. It's not just extra trouble, but not correct at all in American writing!

    This is not your first ODD -- and casual -- use of accented characters as exactly NO American does, not even if proficient with HTML codes. NO American pays any attention to those, let alone has a furrin keyboard on which handy.

    And you're not the only fanboy casually using accented characters instead of NONE as standard American! This web-site has fanboys with UNIQUE writing characteristics and uniformly orthodox corporatist advocacy. -- Therefore, you are NOT an American as claim.

  59. Anonymous Coward, 14 May 2021 @ 8:50pm

    Re: Re:

    You used to format your italics with slashes /like this/, blue. You're not in any position to bitch.

    DMCA voted.

  60. Anonymous Coward, 14 May 2021 @ 11:24pm

    Re: Re: Re: Actual Malice

    People sell nudes via Twitter (and many other social media sites that have file sharing), ship them via twitter, and can take bitcoin for payment, without anyone asking for 2257 compliance. It's not unreasonable to assume that some minors have exploited this.

    For the sake of argument, let's assume this is true. Is there any proof that Twitter took no action to stop this when made aware of specific cases occurring?

    You can't be general about it. Website hosting providers aren't liable for child porn if they aren't aware of its specific location on their servers, even if they say "yeah, there's probably child porn on our servers somewhere." Twitter isn't going to be held liable either unless specific cases of them being aware of it and doing nothing about it can be shown.

  61. Anonymous Coward, 14 May 2021 @ 11:39pm

    Re: Re: Re: Re:

    Other countries don't even have it (they have notice-and-takedown and somehow still exist).

    Yes, and multiple studies of how things work in those countries show that lots of important speech gets suppressed because of that.

    Indeed. Other countries also don't have the Compuserve and Prodigy cases that say even notice-and-takedown isn't good enough; if you moderate at all, you're automatically liable for anything you leave up, even if you never actually saw it.

    In the US, repeal 230 and it forces the extremes: Remove everything (because you're liable for anything you inadvertently miss), or remove nothing at all.

  62. Samuel Abram (profile), 15 May 2021 @ 4:38am

    Re: Re: "preexisting" (with umlauts that don't reproduce) ??????

    Where the HELL did you get that? NO American, not even the most academic, has spelled it with umlauts for a hundred years! -- It's not uncommon, not rare, not even a possible affectation: NO American uses accented characters. It's not just extra trouble, but not correct at all in American writing!

    First of all, they're diaereses, not umlauts. umlauts change the pronunciation of the vowel, whereas diaereses split a diphthong.

    Second of all, you must not read the New Yorker. Their orthography has diaereses all over the place in words where two of the same vowel next to each other have different sounds, such as "coöperate", "reënactment", "preëmptive", etc.

    I thought it looked cool, so I adopted it. Apparently, so did Stephen T. Stone. If you have such a big problem with it, take it up with the New Yorker magazine.

  63. PaulT (profile), 15 May 2021 @ 10:00am

    Re: Re: Re: Re: Re:

    "Other countries also don't have the Compuserve and Prodigy cases that say even notice-and-takedown isn't good enough"

    Other countries also don't have the history and culture of litigation that encourages people to sue every innocent bystander in the hope of a large payout. Many of them already have it written into their law that you can't sue people for things they had no control over, making section 230 unnecessary there.

  64. Uriel-238 (profile), 15 May 2021 @ 1:54pm

    Americans using diaereses

    ...Also Mr. Lovecraft was fond of them.

    When metal bands use them, they're ümlauts.

  65. Anonymous Coward, 15 May 2021 @ 7:05pm

    Re: Re: Re: Re: Actual Malice

    You can't be general about it

    The problem with copyright enforcement advocates is that "being general" about their claims is the only tactic they have in their playbook. They go to court secretly hoping the judges don't look at their evidence standards too closely. You're not going to get anything more than a "but but but Google" out of Jhon.

  66. Glenn, 16 May 2021 @ 3:12pm

    Re:

    Still, if an LGBTQH person does it or says it--or if it's done for an LGBTQH reason, then it's considered progressive or enlightened or whatever; but if a straight person does or says the same kind of thing, then it's viewed as sexual harassment.

  67. Scary Devil Monastery (profile), 18 May 2021 @ 12:42am

    Re: Re: "preexisting" (with umlauts that don't reproduce) ??????

    "And you're not the only fanboy casually using accented characters instead of NONE as standard American! "

    I'm not too surprised the "standard american" doesn't know how to spell their own language given that the american National Center for Educational Statistics shows 21 whopping percent of americans are functionally illiterate.

    Baghdad Bob, it isn't that americans have evolved their language. It's that many of them have forgotten that they have one in the first place. And then, as you just demonstrated, those people react to any sign of erudition with chest-beating and flung feces.

    It never fails. As soon as dumb assholes get numerous enough the first thing they do is start screaming about the smart people.

  68. Scary Devil Monastery (profile), 18 May 2021 @ 12:45am

    Re: Americans using diaereses

    "...Also Mr. Lovecraft was fond of them."

    Lovecraft was one of the early giants upon whose shoulders stand people like Stephen King.

    Also a seriously racist asshole but you can't fault his writing.

    It's really ironic that the tiki-torch wielding double-digit IQ fuckwits of today would be considered "untermenschen" by all their spiritual predecessors. At least the racists in days of yore knew how to read.

  69. Scary Devil Monastery (profile), 18 May 2021 @ 1:19am

    Re: Re:

    "Still, if an LGBTQH person does it or says it--or if it's done for an LGBTQH reason, then it's considered progressive or enlightened or whatever..."

    The same way it's OK for a black man to use the N-word and for a jew to crack jewish jokes, yes. A minority has the right to repossess the slurs used against them.

    It's OK for the same reason that facepalming yourself is OK but facepalming someone else usually isn't. It's not rocket science.

  70. Scary Devil Monastery (profile), 18 May 2021 @ 1:21am

    Re: Re: Apply the old adage...

    "Religion can also have an impact, and certain political factions have been turning into religious cults recently..."

    It'd be more accurate to say that pre-existing religious cults have found common political ground rather than the other way around. The saner people in those political factions have long left.

  71. Scary Devil Monastery (profile), 18 May 2021 @ 1:33am

    Re: Re: Re: Actual Malice

    "People sell nudes via Twitter (and many other social media sites that have file sharing)"

    Even assuming this were true, what, exactly is Twitter supposed to do about it? Surveill all private one to one operations? Monitor all exchanges of private addresses?

    You guys can't have it both ways. Either 230 exists in which case Twitter will at least have a few tools at their disposal, or you resign social platforms to that "common carrier" status you keep harping about in which case Twitter can do fuck-all about anything their clients get up to.

    link to this | view in thread ]

  72. icon
    Scary Devil Monastery (profile), 18 May 2021 @ 1:37am

    Re: Re: Re: Re: Actual Malice

    "You can't be general about it. Website hosting providers aren't liable for child porn if they aren't aware of its specific location on their servers..."

    The really amusing part is where Baghdad Bob here is now arguing the exact opposite of his previous tack - that because of the existence of <undesirable behavior X> social platforms NEED to moderate and surveill their clients.

    It never fails. Leave Baghdad Bob to spout his rants long enough and he'll eventually deliver equally outrageous bad arguments for both sides of any given topic. I swear, if that guy flips a coin and gets to make two choices on which side'll be up he'll still end up losing.

    link to this | view in thread ]

  73. icon
    Scary Devil Monastery (profile), 18 May 2021 @ 1:46am

    Re: Ulterior motive

    "It would seem that many critics of this piece of legislation only really want to modify it to hold platforms accountable for speech that they (the critics) disagree with."

    Close. The biggest advocates for 230 reform are the religious, the shattered remains of the conservative part of the body politic, and the racists and bigots.

    230 poses similar problems for all of these; The first one being that to a GOP politician of today it's a terrible burden to have thousands of people fact-checking the outrageous bullshit you keep pulling and put it in context with your previous bloopers. The same holds true for democrat politicians of course, but by and large democrats don't lie quite that obviously or feed outright conspiracy nonsense or puritan zealotry.

    The second problem with 230 for these Very Fine People is that if the platform gets to moderate based on their own cognizance it means most platforms will try to accommodate the majority of their user base which doesn't want to see neo-nazis or Proud Boys present their case for white supremacy everywhere.
    Yeah, these Very Fine People can certainly go on sites like Parler, Gab or Stormfront but there's no audience there. They don't want to grab a bullhorn and a soapbox in a public square, they want the right to march into the most popular bar in town and start screaming at people without the expectation of being thrown out.

    230 is very inconvenient because it means private property owners get to choose under which rules they allow people onto their property. That sticks in the craw of people who have been using their own living rooms as a toilet and think they're owed a fresh floor in someone else's house.

    Hence why every anti-230 argument tends to be based on lies, hyperbole and false assumption.
