Why Section 230 'Reform' Effectively Means Section 230 Repeal

from the catalog-of-bad-ideas dept

Some lawmakers are candid about their desire to repeal Section 230 entirely. Others, however, express more of an interest in splitting the baby: "reforming" the statute in some way that would somehow magically fix all the problems with the Internet, without doing away with the whole thing and therefore the whole Internet as well. This post explores several of the ways they propose to change the statute, ostensibly without outright repealing it.

And several of the reasons why each proposed change might as well be an outright repeal, given each one's practical effect.

But before getting into the specifics about why each type of change is bad, it is important to recognize the big reason why just about every proposal to change Section 230, even just a little bit, undermines it to the point of uselessness: if you have to litigate whether Section 230 applies to you, you might as well not have it on the books in the first place. Which is why there's really no such thing as a small change: if a change in any way puts that protection in doubt, it has the same debilitating effect on online platform services as an actual repeal would.

This is a key point we keep coming back to, including in suggesting that Section 230 operates more as a rule of civil procedure than any sort of affirmative subsidy (as it is often mistakenly accused of being). Section 230 does not do much that the First Amendment would not itself do to protect platforms. But the crippling expense of having to assert one's First Amendment rights in court, and potentially at an unimaginable scale given all the user-generated content Internet platforms facilitate, means that this First Amendment protection is functionally illusory if there's not a mechanism to get platforms out of litigation early and cheaply. It is the job of Section 230 to make sure they can get out, and that they won't have to worry about being bled dry by legal costs defending themselves even where, legally, they have a defense.

Without Section 230 their only choice would be to not engage in the activity that Section 230 explicitly encourages: intermediating third-party content, and moderating it. If they don't moderate, their services may become a cesspool, but if the choice they face is either to moderate and potentially be bankrupted in litigation (or even, as in the case of FOSTA, potentially prosecuted), or not to moderate at all, then they won't. And as for intermediating content, if they can get into legal trouble for allowing the wrong content, then they will either host less user-generated content, or not be in the business of hosting any user content at all. Because if they don't make these choices, they set themselves up to be crushed by litigation.

Which is why it is not even the issue of ultimate liability that makes lawsuits such an existential threat to an Internet platform. It's just as bad if the lawsuit that crushes them is over whether they were entitled to the statutory liability protection needed to avoid the lawsuit entirely. And we know lawsuits can have that annihilating effect when platforms are forced to litigate these questions. One conspicuous example is Veoh Networks, a video-hosting service that today should still be a competitor to YouTube. But it isn't a competitor because it is no longer a going concern. It was obliterated by the costs of defending its entitlement to assert the more conditional DMCA safe harbor defense, even though it won! The Ninth Circuit found the platform should have been protected. But by then it was too late; the company had been run out of business, and YouTube lost a competitor that, today, the marketplace still misses.

It would therefore be foolhardy and antithetical to lawmakers' professed interest in having a diverse ecosystem of Internet services were they to do anything to make Section 230 similarly conditional, thereby risking even further market consolidation than we already have. But that's the terrible future that all these proposals invite.

More specifically, here's why each type of proposal is so infirm:

Liability carve-outs. One way lawmakers propose to change Section 230 is to deny its protection to specific forms of liability that may arise in user content. A variety of these liability carve-outs have been proposed, and all require further scrutiny. For instance, one carve-out popular with lawmakers is trying to make Section 230 useless against claims of liability for posts that allegedly violate anti-discrimination laws. But while at first glance such a carve-out may seem innocuous, we know that it's not. And one reason it's not is that people eager to discriminate have shown themselves keen to try to force platforms to help them do it, including by claiming that anti-discrimination laws protect their own efforts to discriminate. So far they have largely been unable to conscript platforms into enabling their hate, but if Section 230 no longer protects platforms from these forms of liability, then racists will finally be able to succeed by exploiting that gap.

These carve-outs also run the risk of making it harder for people who have been discriminated against to find a place to speak out about it, since they will force platforms to be less willing to offer space to speech they might find themselves forced to defend, because even if the speech were defensible, just having to answer for it can be ruinous for the platform. We know that platforms will feel forced to turn away all sorts of worthy and lawful speech if that's what they need to do to protect themselves, because we've seen this dynamic play out as a result of the few carve-outs Section 230 has had from the start. For example, if the thing wrong with the user expression was that it implicated an intellectual property right, then Section 230 never protected the platform from liability for its users' content. Now, it turns out that platforms have some liability protection via the DMCA, but that protection is weaker and more conditional than Section 230, which is why we see so much Swiss cheese online, with videos and other content so often removed – even in cases where they were not actually infringing – because taking it down is the only way platforms can avoid trouble and not run the risk of going the way of Veoh Networks themselves.

Such an outcome is not good for encouraging free expression online, which was a main driver behind passing Section 230 originally, and it isn't even good for the people these carve-outs were ostensibly intended to help, as we saw with FOSTA, the liability carve-out most recently added. Instead of protecting people from sexual exploitation, FOSTA led to platforms taking away their platform access, which drove them into the streets, where they got hurt or killed. And, of course, it also led to other perfectly lawful content disappearing from the Internet, like online dating and massage therapy ads, since FOSTA had made it impossibly risky for the platforms to continue to facilitate them.

It's already a big problem that there are even just these liability carve-outs. If Section 230 were to be changed in any way, it should be changed to remove them. But in any case, we certainly shouldn't be adding any more if Section 230 is still to have any utility in protecting the platforms we need to facilitate online user expression.

Transactional speech carve-outs. As described above, one way lawmakers are proposing to change Section 230 is to carve out certain types of liability that might attach to user-generated content. Another way is to try to carve out certain types of user expression itself. And one specific type of user expression in lawmakers' crosshairs (and also some courts') is transactional speech.

The problem with this invented exception to Section 230 is that transactional speech is still speech. "I have a home to rent" is speech, regardless of whether it appears on a specialized platform that only hosts such offers, or on a more general-purpose platform like Craigslist or even Twitter, where such posts are just some of the kinds of user expression enabled.

Lawmakers seem to be getting befuddled by the fact that some of the more specialized platforms may earn their money through a share of any consummated transaction the user expression they host might lead to, as if this form of monetization were somehow meaningfully distinct from any other monetization model, or somehow waived their First Amendment right to do what basically amounts to moderating speech to the point where it is the only type of user content they allow. And it is this apparent befuddlement that has led to attempts by lawmakers to tie Section 230 protection to certain monetization models, and even to eliminate it for some of them.

Even if these proposals were carefully drafted, they would only end up chilling e-commerce by forcing platforms to use less viable monetization models. But what's worse is that the current proposals are not being carefully drafted, and so we end up seeing bills end up threatening the Section 230 protection of any platform with any sort of profit model. Which, naturally, they all need to have in some way. After all, even non-profit platforms need some sort of income stream to keep the lights on, but proposals like these threaten to make it all but impossible to have the money needed for any platform to operate.

Mandatory transparency report demands. As we've discussed before, it's good for platforms to try to be candid about their moderation decisions and especially about what pressures forced them to make these decisions, like subpoenas and takedown demands, because it helps highlight when these instruments are being abused. Such reports are therefore a good thing to encourage.

But encouragement is one thing; requiring them is another, yet that's what certain proposals try to do by conditioning Section 230 protection on the publication of these reports. And they are all a problem. Making transparency reports mandatory is an unconstitutional form of compelled speech. Platforms have the First Amendment right to be arbitrary in their moderation practices. We may prefer them to make more reasoned and principled decisions, but it is their right not to. But they can't enjoy that right if they are forced to explain every decision they've made. Even if they wanted to, it may be impossible, because content moderation happens at scale, which inherently means it will never be perfect. And it also may be ill-advised to be fully transparent, because doing so teaches bad actors how to game their systems.

Obviously a platform could still refuse to produce the reports as these bills would prescribe. But if that decision risks the statutory protection the platform depends on to survive, then it is not really much of a decision. It finds itself compelled to speak in the way that the government requires, which is not constitutional. And it also would end up impinging on that freedom to moderate, which both the First Amendment and Section 230 itself protect.

Mandatory moderation demands. But it isn't just transparency in moderation decisions that lawmakers want. Some legislators are running straight into the heart of the First Amendment and demanding that they get to dictate how platforms do any of their moderation, by conditioning Section 230 protection on the platforms making these decisions the way the government insists.

These proposals tend to come in two political flavors. While they are generally utterly irreconcilable – it would be impossible for any platform to satisfy both of them at once – they each boil down to the same unconstitutional demand.

Some of these proposals reflect legislative outrage at platforms for some of the moderation decisions they've made. Usually they condemn platforms for having removed certain speech or even banned certain speakers, regardless of how poorly those speakers behaved or how harmful the things they said. This condemnation leads lawmakers who favor these speakers and their speech to want to take away the platforms' right to make these sorts of moderation decisions by, again, conditioning Section 230 on the platforms continuing to leave these speakers and their speech up on their systems. The goal with these proposals is to set up a situation where it is impossible for platforms to continue to exercise their First Amendment discretion in moderation and potentially take them down, lest they lose the protection they depend on to exist. Which is not only unconstitutional compulsion, but also itself ultimately voids the part of Section 230 that expressly protects that discretion, since it's discretion that platforms can no longer exercise.

On the flip side, instead of conditioning Section 230 on not removing speakers or speech, other lawmakers would like to condition it on platforms kicking off certain speakers and speech (and sometimes even the same ones that the other proposals are trying to keep up). Which is just as bad as the other set of proposals, for all the same reasons. Platforms have the constitutional right to make these moderation choices however they choose, and the government does not have the right, per the First Amendment, to force them to make them in any particular way. But if their critical Section 230 protection can be taken away whenever they don't moderate however the sitting political power demands at the moment, then that right has been impinged upon and Section 230 rendered a nullity.

Algorithmic display carve-outs. Algorithmic display has become a target for many lawmakers eager to take a run at Section 230. But as with every other proposed reform, changing Section 230 so that it no longer applies to platforms using algorithmic display would end up obliterating the statute for just about everyone. And it's not clear that lawmakers proposing these sorts of changes quite realize this inevitable impact.

And part of the problem seems to be that they don't really understand what an algorithm is, or how commonly algorithms are used. They seem to regard algorithms as something nefarious, but there's nothing about an algorithm that inherently is. The reality is that nearly every platform uses software in some way to handle the display of user-provided content, and algorithms are just the programming logic coded into the software giving it the instructions for how to display that content. Moreover, these instructions can even be as simple as telling the software to display the content chronologically, alphabetically, or in some other way the platform has decided to render content, which the First Amendment protects. After all, a bookstore can decide to shelve books however it wants, including in whatever order or with whatever prominence it wants. What these algorithms do is implement these sorts of shelving decisions, just as applied to the online content a platform displays.
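
To make concrete just how mundane these display "algorithms" can be, here is a minimal sketch (in Python, with hypothetical field names, not any actual platform's code) of the kind of logic just described: nothing more than sorting user posts chronologically or alphabetically before rendering them.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    text: str
    created_at: datetime

def chronological_feed(posts):
    """Newest posts first -- the classic reverse-chronological timeline."""
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

def alphabetical_feed(posts):
    """Posts ordered by author name -- another perfectly ordinary 'algorithm.'"""
    return sorted(posts, key=lambda p: p.author.lower())

posts = [
    Post("alice", "I have a home to rent", datetime(2021, 10, 11, 9, 0)),
    Post("bob", "Section 230 explainer thread", datetime(2021, 10, 12, 8, 30)),
]

# Either function is, in the sense lawmakers are targeting, an "algorithm"
# deciding how user-generated content gets displayed.
for post in chronological_feed(posts):
    print(post.created_at.isoformat(), post.author, "-", post.text)
```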

If algorithms were to end up effectively banned by making the Section 230 protection platforms need to host user-generated content contingent on not using them, it would make it impossible for platforms to actually render any of that content. Either they couldn't do it technically, if they abided by the rule in order to keep their Section 230 protection, or they couldn't do it legally, if that protection were withheld because they used algorithmic display. Such a rule would also represent a fairly significant change to Section 230 itself by gutting the protection for moderation decisions, since those decisions are often implemented by an algorithm. In any case, conditioning Section 230 on not using algorithms is not a small change but one that would radically upend the statutory protection and all the online services it enables.

Terms of Service carve-outs. One idea (which is, oddly, backed by Facebook, even though it needs Section 230 to remain robust in order to defeat litigation like this) is that Section 230 protection should be contingent on platforms upholding their terms of service. As with these other proposals, this one is also a bad idea.

First of all, it negates the utility of Section 230 protection by making its applicability the subject of litigation. In other words, instead of being protected from litigation, platforms will now have to litigate whether they are protected from litigation, which means they aren't really protected at all.

It also fails to understand what terms of service are for. Platforms have them in order to limit their liability exposure. There's no way that they are going to write them in a way that has the effect of increasing their liability exposure.

The way they are generally written now is to put potentially wayward users on notice that if they don't act consistently with these terms of service, the service may be denied them. They aren't written to be affirmative promises to do anything, because they can't be affirmative promises – content moderation at scale is impossible to do perfectly, so it would be foolish for platforms to obligate themselves to do the impossible. But that's what changing Section 230 in this way would do: create this obligation if platforms are to retain their needed protection.

This pipe dream that some seem to have – that if only platforms did more moderation in accordance with their terms of service as currently written, everything would be perfect and wonderful – is hopelessly naïve. After all, nothing about how the Internet works is nearly that simple. Nevertheless, it is fine to want platforms to do as much as they can to meet the aspirational goals they've articulated in their terms of service. But changing Section 230 in this way won't lead them to do so. Instead it will make it legally unsafe for platforms to even articulate any such aspirations, and thus less likely to meet any of them. Which means that regulators won't get more of what they seek with this sort of proposal, but less.

Pre-emption elimination. One of the key clauses that makes Section 230 useful is its pre-emption provision. This is the provision that tells states that they cannot rejigger their own state laws in ways that would interfere with the operation of Section 230. The reason it is so important is that it gives platforms the certainty they need to be able to benefit from the statute's protection. For it to be useful they need to know that it applies to them and that states have no ability to mess with it.

Unfortunately we are already seeing increasing problems with state and local jurisdictions attempting to ignore this pre-emption provision, and courts even sometimes letting them. But on top of that there are proposals in Congress to deliberately undermine it. In fact, with FOSTA, it already has been undermined, with individual state governments now able to impose liability directly on platforms for their users' activity, no matter how arbitrarily.

The moderation bills illustrate what is wrong with states getting to mess with Section 230 and make its protection suddenly conditional – and therefore effectively useless. Given our current political polarization, the problem should be obvious: how is any platform going to reconcile the moderation demands of a Red State with the moderation demands of a Blue State? What is an inherently interstate Internet platform to do? Whose rules should they follow? What happens to them if they don't?

Congress put in the pre-emption provision because it knew that platforms could not possibly comply with all the myriad rules and regulations that every state, county, city, town, and locality might develop to impose liability on platforms. So it told them all to butt out. It's a mistake to now gut that provision if Section 230 is going to still have any value in making it safe for platforms to continue to do their job enabling the Internet.


Filed Under: carve outs, content moderation, free speech, intermediary liability, reform, repeal, section 230, transparency


Reader Comments



  • Anonymous Coward, 12 Oct 2021 @ 11:21am

    It seems to me that everybody who wants to change or eliminate 230 does not support free speech, but is rather seeking the means to force the Internet to reflect their political views, and only their political views.

  • James Burkhardt (profile), 12 Oct 2021 @ 11:25am

    Cathy, loving the breakdown here.

    In the "Transactional speech carve-outs" section, you end with:

    But what's worse is that the current proposals are not being carefully drafted, and so we end up seeing bills end up threatening the Section 230 protection of any platform with any sort of profit model. Which, naturally, they all need to have in some way. After all, even non-profit platforms need some sort of income stream to keep the lights on, but proposals like these threaten to make it all but impossible to have the money needed for any platform to operate.

    The use of the highlighted profit is misleading. The word to use here is revenue. This helps remind people that profit is different from revenue, and that a non-profit doesn't have $0 revenue; it just is not intended to seek revenue in excess of expenses. I like the work as a whole, but that took me out hard as I was reading, as I tried to parse what you were actually trying to say.

    • Cathy Gellis (profile), 12 Oct 2021 @ 11:52am

      Re:

      I meant literally "non-profit" organizations like Wikimedia, as opposed to for-profit commercial enterprises like Twitter.

      • James Burkhardt (profile), 12 Oct 2021 @ 11:56am

        Re: Re:

        I know. My issue wasn't the word non-profit, it was the word profit in the sentence; I screwed up my markdown and did not bold that one.

        I was not admonishing you to use "non-revenue platforms"; a non-profit platform is the correct term. I was admonishing the use of 'profit model', when the word to use is revenue. Not everyone has a profit model; everyone that takes in money has a revenue model.

  • Anonymous Coward, 12 Oct 2021 @ 11:40am

    Thank you for a clear listing of the problems with each of the so-far proposed types of modification to section 230.

    We can only hope that, when deployed against these proposed bills, the politicians proposing the bills don't respond with "I don't care, it gets me votes."

    • sumgai (profile), 12 Oct 2021 @ 8:16pm

      Re:

      ... the politicians proposing the bills don't respond with "I don't care, it gets me votes.

      Good luck with that one!

  • Koby (profile), 12 Oct 2021 @ 12:02pm [flagged by the community]

    Sign Of Growth

    But the crippling expense of having to assert one's First Amendment rights in court, and potentially at an unimaginable scale given all the user-generated content Internet platforms facilitate

    Numerous other industries have had to go through this same process. The scale doesn't matter. Everyone has a vehicle in their driveway. Everyone has a credit card in their wallet. Companies faced the prospect of class action lawsuits from millions of customers. In many cases, the industries helped shape the laws that would govern their product. Others fought serial litigants in court to establish precedent. But it wasn't easy. However, it did result in a more predictable product, one with which customers are more satisfied. Section 230 could use a lemon law.

    • Anonymous Coward, 12 Oct 2021 @ 12:06pm

      Re: Sign Of Growth

      However, it did result in a more predictable product,

      That happens when people more or less agree on the product. When the objective is to gain political advantage by controlling moderation, there is no consensus to be had.

    • James Burkhardt (profile), 12 Oct 2021 @ 12:08pm

      Re: Sign Of Growth

      Vehicle manufacturers are not liable for the use a car is put to. Nor are they required to sell cars to anyone who walks onto the lot. They are only responsible for failures caused by their own actions. Section 230 replicates that level of liability.

      Credit card banks are not liable for the misuse of a credit card, only their own malfeasance. Section 230 replicates this level of liability.

      Thank you for highlighting that Section 230 does not provide special immunity.

    • Anonymous Coward, 12 Oct 2021 @ 12:24pm

      Re: Sign Of Growth

      Hi Koby,

      Due to repeated missed assignments you will fail this class unless you start doing extra credit assignments. Starting with TWO peer-reviewed studies on the so-called "Ferguson Effect" and an essay about how the First Amendment applies only to the government and not private individuals.

    • PaulT (profile), 12 Oct 2021 @ 1:00pm

      Re: Sign Of Growth

      "Everyone has a vehicle in their driveway"

      ...which they have to go through a process to register and legally own before they can have it there.

      "Everyone has a credit card in their wallet"

      ...which they have to apply for and be approved for before they can have it in their wallet.

      "Companies faced the prospect of class action lawsuits from millions of customers."

      ...which your dumb ass tries to insist increases exponentially beyond their control because your Klan buddies can't accept some people as being equal human beings.

    • nasch (profile), 12 Oct 2021 @ 2:31pm

      Re: Sign Of Growth

      Section 230 could use a lemon law.

      You are already free to return your social media accounts for a full refund of the entire purchase price.

      • That One Guy (profile), 12 Oct 2021 @ 3:24pm

        Re: Re: Sign Of Growth

        Based upon past examples I'm sure any 'refund requests' and the complaints attached to them would make for hilarious reading as those complaining either got really vague or instead made clear that the penalty they got was more than justified.

      • Gumnos, 18 Oct 2021 @ 8:39am

        Re: Re: Sign Of Growth

        • Gumnos (profile), 18 Oct 2021 @ 8:42am

          Re: Re: Re: Sign Of Growth

          dagnabbit, no way to delete this? (weird inline login UI submitted the comment before I'd actually gotten a chance to type it)

          • PaulT (profile), 18 Oct 2021 @ 1:06pm

            Re: Re: Re: Re: Sign Of Growth

            No, the level of intellectual dishonesty from certain people here means that tools that can be easily abused are not well suited to being made available. But for accidents like this, it's clear that it is what it is, and people who make honest mistakes do what you did: shrug and carry on.

      • Gumnos (profile), 18 Oct 2021 @ 8:41am

        Re: Re: Sign Of Growth

        Though my understanding is that I paid with my data—usage patterns, friend-network information, posted content others ostensibly want to read, etc.

        Does this mean I can get them to give me all that data back and not keep any of it?

    • Anonymous Coward, 12 Oct 2021 @ 6:32pm

      Re: Sign Of Growth

      Here is your daily reminder that you have no fucking clue how section 230 works.

      Remember that time that you thought Facebook could use section 230 to dismiss a lawsuit against Facebook's own speech?

      I do:

      Instead, they will seek a dismissal based on grounds that their speech did not reach the level of actual malice, or perhaps 230.

      You really do suck at this, just saying...

      • That One Guy (profile), 12 Oct 2021 @ 8:33pm

        Re: Re: Sign Of Growth

        Here is your daily reminder that you have no fucking clue how section 230 works.

        It is difficult to get a person to understand something when their entire argument/position depends upon their dishonest choice to pretend not to.

  • Anonymous Coward, 12 Oct 2021 @ 12:03pm

    if there is something that can be done that means restricting access to the internet for ordinary people, while at the same time handing to certain industries the right to prevent ordinary people from having that access (unless they pay, of course) it is going to happen. what makes this so scarey is that those who are plying for 230 to be 'reformed' are politicians who are all in the pay clutches of the industries that want to take this internet control, but no one of any consequence can see it or they dont want to see it or dont give a fuck anyway! but once the net access has been lost to us, once it has been put under the control of these industries (and they are the ENTERTAINMENT INDUSTRIES, pure and simple) it wont ever come back!

    • Anonymous Coward, 12 Oct 2021 @ 12:15pm

      Re:

      Go and read about the events leading to the Reformation, where entrenched powers, the Church and the aristocracy, tried to control what was printed, and eventually failed, although it took several wars of persecution to break that power.

      The invention of the printing press is the nearest historical event to the invention of the Internet. It enabled one-to-many communication, and radio and television are just faster ways of implementing one-to-many communications, where those who control the presses and studios decide what gets widespread distribution. The Internet is revolutionary in that it enables many-to-many communications, a step change in communication ability like the step change from letters to the printed book.

      • Anonymous Coward, 12 Oct 2021 @ 12:48pm

        Re: Re:

        The internet is very complicated and likely very hard to control; it would be very hard for certain industries to prevent ordinary people from having that access and then force them to pay for it (many already pay their ISP for access anyway).

    • Anonymous Coward, 12 Oct 2021 @ 12:42pm

      Re:

      Thing is, it may not happen at all because no one can agree on how to reform or repeal Section 230, at least right now, and it's unlikely the internet will be put under the control of these industries.

    • simality (profile), 14 Oct 2021 @ 9:04pm

      Re:

      Exactly. AOL was a major proponent of the Children's Online Privacy Protection Act. This law led to the shutdown of pretty much every single children's website almost overnight. You know what didn't get shut down? The AOL Kids area. Because the law was drafted in such a way that it didn't apply to the AOL Kids area. :-/

  • That One Guy (profile), 12 Oct 2021 @ 12:56pm

    'I'm not cutting down the tree, I'm just removing all the roots'

    I have no doubt that for a good many politicians gunning for 230 the fact that reform is the functional equivalent of repeal is seen as a feature, not a bug, as it allows them to effectively kill the law without having to admit or defend that they're trying to do so.

  • Anonymous Coward, 12 Oct 2021 @ 1:24pm

    Actual Section 230 Reform has another name

    It is called repealing abortions like FOSTA. No wait, informed people actually want abortions.

    • Anonymous Coward, 16 Oct 2021 @ 9:50am

      Re: Actual Section 230 Reform has another name

      No, only people who think convenience is more important than human life and can't be bothered with responsibility.

  • ECA (profile), 12 Oct 2021 @ 1:38pm

    So

    Take a simple Short law/regulation.
    Backdoor it, compromise it, Rip it apart and what do you get?

    Pages and pages of BS to sort thru that only a lawyer MIGHT be able to Fathom.
    But can we take the 1st amendment and Back it up. That NO ONE is responsible for what another person or company SAYS OR DOES.
    A lot of this tends to be 1 simple fact. REAL NAMES and your personal info or where you live, so the corps can find you. And if they find you don't own anything, then they can SUE someone that has money.

    The worst part of all this is WHO is responsible and WHO pays, on the internet. NOT for all the other corps out there that SCREW us every day.

  • freelunch (profile), 12 Oct 2021 @ 2:15pm

    fortunately

    in this instance, there is no political consensus to pass any specific reform, since the near consensus that Section 230 creates "a problem" hides fundamental disagreements about what the problem might be, with such common gripes as "too much disinformation" and "they are censoring discriminatorily" calling for changes in opposite directions.

    Thank you for this insightful article, Ms. Gellis.

  • Vermont IP Lawyer (profile), 12 Oct 2021 @ 3:50pm

    Mea Culpa

    Several months ago, I posted a comment to a different article in which (here comes the mea culpa) I suggested that the community of people who read and post to Techdirt was extremely qualified to respond to Section 230 criticisms with possible improvements to Sec. 230. My post was more or less uniformly condemned by this community (sometimes not in the politest terms). Many of the comments on my post suggested that I must hate free speech and/or Sec. 230 and/or have a political agenda. That was not and is not the case--I may not be as much of a 1st Amendment "absolutist" as some of those who post here but I lean strongly in that direction.

    (For example, I disagree with this assertion in the very first comment on Cathy's post: "everybody who wants to change or eliminate 230 does not support free speech, but is rather seeking the means to force the Internet to reflect their political views, and only their political views." More accurate if it changed "everybody" to "most.")

    So, with that introduction, let me say that I REALLY like Cathy's explanation of the defects in a wide variety of proposals for amendments to Sec. 230. I agree 100% with the key point that Sec. 230 does not provide a new substantive right but is, rather, a critical civil procedure optimization of what the 1st Amendment would provide for defendants with deep enough pockets.

    In an ideal universe, where everyone with a view on this domain understood Cathy's point and was operating in good faith, maybe we could agree on an improved Sec. 230. But, regrettably, it is clear that in the current real world, any proposed amendment to Sec. 230 will really be designed to further a political agenda and degrade a key constitutional right and, therefore, be worthy of condemnation.

    • nasch (profile), 12 Oct 2021 @ 4:40pm

      Re: Mea Culpa

      maybe we could agree on an improved Sec. 230.

      Improved how? First you would have to make the case that it is in need of improvement. I haven't even seen that case made successfully, let alone any coherent proposal to "fix" it.

      • Vermont IP Lawyer (profile), 12 Oct 2021 @ 5:12pm

        Re: Re: Mea Culpa

        Comment slightly missing my point. If I had some brilliant idea for an improvement, I'd say what it was. In my earlier comment, I was just wondering whether this community might come up with some alternatives to "don't try to fix it; leave it alone." Repeating myself, in an ideal world, we could all have polite debate about what that might be. But, in the real world, as convincingly explained by Cathy, the winner is "don't try to fix it; leave it alone."

        • That One Guy (profile), 12 Oct 2021 @ 5:44pm

          Re: Re: Re: Mea Culpa

          'Ideal world' or not, the premise still seems to be based on the idea that there's something that needs to be fixed, and, similar to Nasch, I've yet to see that argument presented in any convincing or even persuasive way.

          You could (and I have) argue that 230 shouldn't be needed because the legal system should recognize that, just like it would be absurd to blame someone who sold a car if the one they sold it to got drunk and hit someone, it's equally absurd to blame the platform if a user misuses it. But that's a different argument than arguing whether it needs to be 'fixed' or 'improved' in the legal landscape we do have, and in that landscape it seems to work just fine.

        • Strawb (profile), 13 Oct 2021 @ 1:14am

          Re: Re: Re: Mea Culpa

          I was just wondering whether this community might come up with some alternatives to "don't try to fix it; leave it alone."

          But again, what is the case for it being a good idea to come up with an alternative?

        • PaulT (profile), 18 Oct 2021 @ 1:07pm

          Re: Re: Re: Mea Culpa

          "I was just wondering whether this community might come up with some alternatives to "don't try to fix it; leave it alone."

          People are generally open to suggestions that don't completely violate the issue that it fixes or make things fundamentally worse for everyone. Do you have such a suggestion?

    • Toom1275 (profile), 12 Oct 2021 @ 8:03pm

      Re: Mea Culpa

      Do you mean this comment
      https://www.techdirt.com/articles/20210128/17043346145/no-revoking-section-230-would-not-save-democracy.shtml#c832

      Where you claimed compelled speech would be an "improvement" to Section 230?

    • Scary Devil Monastery (profile), 13 Oct 2021 @ 6:25am

      Re: Mea Culpa

      "But, regrettably, it is clear that in the current real world, any proposed amendment to Sec. 230 will really be designed to further a political agenda and degrade a key constitutional right and, therefore, worthy of condemnation."

      This, in essence, is the important bit. Once you've compromised on a principle everything else becomes a matter of scale - which will always tilt in the due direction of vested interests who have a lot of money and/or political power riding on the outcome.

      In short, we can't have nice things because the Powers That Be consist of too many shady grifters. A natural outcome of an election system always aimed at individuals rather than parties, and a first-past-the-post system where the winner takes all and politics are naturally skewed towards extremes.

      230 works well enough because it is simple. You literally can't change a single word without it breaking completely.

  • Glenn, 12 Oct 2021 @ 3:52pm

    The rallying cry of 230 "reformers" is: "Free Speech is not for everyone!" (or "I'm OK, but you're a problem")

  • Blake C. Stacey (profile), 12 Oct 2021 @ 3:56pm

    Is there a bill tracker for all these different bad proposals? I can't keep them straight in my head any longer.

  • Anonymous Coward, 12 Oct 2021 @ 4:02pm

    Put simply, if you need to go to court to show Section 230 protects moderation choices or the freedom to block or remove users who break the rules, then many small websites will shut down or remove all ability to comment or make a post, e.g. minority voices will be silenced and only big services like Facebook will be left for ordinary people to use for political discussions.
    The whole point of Section 230 is to stop expensive legal cases brought by trolls or extremist users; also, some users will bring random legal cases in the hope of closing down the website or service.

  • Anonymous Coward, 13 Oct 2021 @ 1:47pm

    When you become powerful enough, you can write the rules. Sec 230 will be changed, it is just a matter of time.

    • Scary Devil Monastery (profile), 14 Oct 2021 @ 5:01am

      Re:

      "Sec 230 will be changed, it is just a matter of time."

      At which point the US online environment collapses to one or two monopolies and not much else, leaving China to overrun that sector as well.

      This is how the Romans must have felt in the last days of their empire.

    • PaulT (profile), 18 Oct 2021 @ 1:09pm

      Re:

      "Sec 230 will be changed, it is just a matter of time."

      Then it's a matter of whether those changes do something to make things better, or a hell of a lot worse. People are generally open to the former, but the suggestions as to how to actually improve a rule that essentially says "prosecute the person who did a thing rather than the most convenient bystander" are thin on the ground.


