Forcing Internet Platforms To Police Content Will Never Work

from the it's-never-enough dept

For many years now, we've pointed out that whenever people -- generally legacy content companies and politicians -- push for internet platforms like Google, Facebook and Twitter to "police" content, no matter what those platforms do, it is never going to be enough. For example, after years of resisting, Google finally caved to the demands of the MPAA and the RIAA and started using DMCA notices as a signal in its ranking mechanism. This was the first time Google ever gave outside actors any direct influence over its organic search rankings. And in doing so, we feared two things would happen: (1) it would encourage others to start demanding similar powers over Google, and (2) even those to whom Google caved would complain that the company wasn't doing enough. Indeed, that's exactly what happened.
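
(For the technically inclined, here's a minimal sketch of what "DMCA notices as a ranking signal" could look like in principle. The Python below is a hypothetical illustration -- the function, weights and formula are invented for this post, not Google's actual, unpublished algorithm:)

    import math

    def demoted_score(base_relevance: float, dmca_notices: int,
                      penalty_weight: float = 0.1) -> float:
        """Hypothetical: demote a result's score as takedown notices pile up."""
        # Diminishing penalty: each additional notice matters a little less.
        penalty = penalty_weight * math.log1p(dmca_notices)
        return base_relevance / (1.0 + penalty)

    # site-b has higher base relevance but thousands of notices against it,
    # so it ends up ranked below site-a.
    results = [("site-a.example", 0.92, 0), ("site-b.example", 0.95, 4000)]
    ranked = sorted(results, key=lambda r: demoted_score(r[1], r[2]), reverse=True)
    print([name for name, _, _ in ranked])  # ['site-a.example', 'site-b.example']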

With that in mind, it was great to see UK lawyer Graham Smith (of the excellent Cyberleagle blog, where he regularly opines on issues related to attempts to regulate internet platforms) recently come up with a simple set of rules for how this works, which he dubbed the Three Laws of Internet Intermediaries (might need some marketing polish on the name...).

Note: the first and the last ones are basically identical to the concerns I raised (though stated more succinctly). But the middle one may be the most interesting and worth exploring. Every time we've seen internet platforms agree to start "policing" or "moderating" or "filtering" content, it fails. Often miserably. Sometimes hilariously. And people wonder, "how the hell could this happen?" How could YouTube -- pressured to stop terrorist content from appearing -- take down evidence of war crimes instead? How could Facebook -- pressured to stop harassment and abuse -- leave that content up, but silence those reporting the harassment and abuse?

We've argued, many times, that much of the problem is an issue of scale. Most people have no idea just how many of these kinds of decisions platforms are forced to make in a never-ending stream of demands to "do something." And even when they hire lots of people, actually sorting through this stuff to understand the context takes time, knowledge, empathy and perspective. It's impossible to do that with any amount of speed -- and it's basically impossible to find enough people who want to dig through such context, with the skills necessary to make the right decisions most of the time. And, of course, that assumes that there even is a "right decision" -- when, more often than not, there's a very fuzzy gray area.
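
(A quick back-of-envelope sketch makes the scale problem concrete. The inputs below are assumptions -- the widely cited figure of roughly 400 hours of video uploaded to YouTube every minute circa 2017, plus guesses at reviewer throughput -- but even rough numbers tell the story:)

    # Assumed inputs: ~400 hours uploaded per minute (widely cited for
    # YouTube around 2017); reviewers screening at double playback speed.
    upload_hours_per_minute = 400
    review_speedup = 2.0
    reviewer_hours_per_year = 8 * 250  # an 8-hour day, 250 workdays a year

    uploaded_per_year = upload_hours_per_minute * 60 * 24 * 365
    review_hours = uploaded_per_year / review_speedup
    reviewers = review_hours / reviewer_hours_per_year

    print(f"{uploaded_per_year:,} hours uploaded per year")           # 210,240,000
    print(f"~{reviewers:,.0f} full-time reviewers to watch it once")  # ~52,560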

And, thus, the end result is that these platforms get pressured into "doing something" -- which is never enough. And then more and more people come out of the woodwork, demanding that more be done (or that similar things be done on their behalf). And then the platforms make mistakes. Many, many mistakes. Mistakes that look absolutely ridiculous when put into context -- without recognizing that those who made the decisions were unlikely to know the context, or to have any realistic way of gaining it. And that leads to the second law that Graham points out: not only is it never enough, and not only do more and more people demand things be done for them, but on top of all that, people will get mad about the total lack of "accountability" and "transparency" in how these decisions are made.

Now, I'd agree that many platforms could be much more transparent about how these decisions are made, but that creates another corollary to these rules: the more transparent and accountable you are, (1) the more people game the system and make things worse, and (2) the angrier people get that "bad stuff" hasn't disappeared.

At a time when the clamor for mandatory content moderation on the internet seems to be reaching a fever pitch, we should be careful what we wish for. It won't work well and the end results may make just about everything worse.

Filed Under: cda 230, graham smith, intermediary liability, section 230


Reader Comments



  • This comment has been flagged by the community.
    Personanongrata, 26 Oct 2017 @ 2:36pm

    And, thus, the end result is that these platforms get pressured into "doing something" -- which is never enough.

    Are these the same platforms that spend tremendous sums of dollars lobbying government to weaken anti-trust laws, tilt the economic landscape in their favor and stealth-author state/federal legislation?

    These platforms may pretend to push back against US government overreach, but at the end of the day they are willing and able accomplices.

    These platforms and government are together one fascist entity.

    How the CIA made Google

    As our governments push to increase their powers, INSURGE INTELLIGENCE can now reveal the vast extent to which the US intelligence community is implicated in nurturing the web platforms we know today, for the precise purpose of utilizing the technology as a mechanism to fight global ‘information war’ — a war to legitimize the power of the few over the rest of us. The lynchpin of this story is the corporation that in many ways defines the 21st century with its unobtrusive omnipresence: Google.

    https://medium.com/insurge-intelligence/how-the-cia-made-google-e836451a959e

    Just as there is no censorship in your local hometown public square, there should be no censorship of the intertubes.

    Let a person's speech stand of its own accord.

    Government/platform censorship is nothing more than petty authoritarian tyrants determining what is and is not suitable for you to read, listen to or watch.

    What has advanced humanity's collective well-being (i.e. standard of living) is the free and open exchange of ideas/information, debated in the public square where they may stand or fall on their own merits.

    • Anonymous Coward, 26 Oct 2017 @ 4:06pm

      Re:

      This is it. Any person, company or government that is willing to amass excessive amounts of power, money, decision-making, etc. (always at the expense of the power, money, rights and freedoms of others) is evil. Simple as that.

      If you are one of those who would love to have 80 billion dollars and live the life, it means you are willing to take 80 billion away from the hands of many (mostly desperate people), just for your sole selfish wishes and desires.

    • Thad, 26 Oct 2017 @ 4:07pm

      Re:

      Flagged.

  • Anonymous Coward, 26 Oct 2017 @ 3:02pm

    If they can't manage their platforms...

    ...then they should turn them off.

    Responsible, ethical, competent, diligent system admins know EVERYTHING that their systems are doing. They design for that, they build for that, they operate for that. They simply don't allow bots and spam and all the other myriad forms of abuse. (And if those things happen anyway? They're quickly crushed out of existence and changes are made to avoid a repeat.)

    These are not the kind of people running Facebook or Twitter or the rest. Those are incompetent, ignorant newbies who have absolutely no business running a single server - let alone a massive enterprise. They're simply not good enough. They don't have what it takes -- not the knowledge, not the experience, and maybe more importantly, not the professionalism. They've built something that they don't know how to run and plugged it into our Internet.

    The best thing for the Internet would be for these operations to cease existing. They're malignant cancers and unfortunately they're metastasizing.

    • Anonymous Coward, 26 Oct 2017 @ 3:15pm

      Re: If they can't manage their platforms...

      I take it you've never been a sysadmin.

      • This comment has been flagged by the community.
        Anonymous Coward, 26 Oct 2017 @ 4:09pm

        Re: Re: If they can't manage their platforms...

        I take it you have never been a doctor. Or a farmer. What is your point? Too difficult to be a sysadmin? Then GTFO!

        • This comment has been flagged by the community.
          Anonymous Coward, 26 Oct 2017 @ 4:09pm

          Re: Re: Re: If they can't manage their platforms...

          I know you won't GTFO even though it is "so hard", because you are a greedy POS.

        • Anonymous Coward, 26 Oct 2017 @ 4:13pm

          Re: Re: Re: If they can't manage their platforms...

          What you typed makes no sense.

        • Stephen T. Stone (profile), 26 Oct 2017 @ 4:34pm

          Re: Re: Re: If they can't manage their platforms...

          The point is that unless they run a tiny site that only a handful of people use every day, a sysadmin will eventually lose, whether to scale or to time, the ability to know “everything” about their system. How many sysadmins do you think YouTube has?

      • Anonymous Coward, 27 Oct 2017 @ 7:30am

        Re: Re: If they can't manage their platforms...

        I've been a sysadmin for close to 40 years, and have designed, built and run all shapes/sizes of operations. If one of mine were as horribly run as Twitter et al., I'd pull the plug on it immediately rather than endure the continued embarrassment of being responsible for it.

        Techniques for controlling abuse have been VERY well-known for decades. They work. But only if they're used in a competent and diligent fashion. No, they're not a panacea -- nothing is -- but they work well enough that what gets past them can be dealt with on a case-by-case basis.

        Moreover, abuse control actually gets EASIER in large operations -- if done properly. So the "but scale!" excuse is not only whining bullshit from losers, it reflects a failure to learn some fundamental lessons and apply them.

        The problem is not that this task is hard. The problem is the combined ignorance and arrogance of the losers running these sites: they'd rather throw up their hands and abdicate responsibility than actually do the work that's required.

        • Anonymous Coward, 27 Oct 2017 @ 8:27am

          Re: Re: Re: If they can't manage their platforms...

          "Techniques for controlling abuse have been VERY well-known for decades. "

          - and those techniques are: (enter technique here)


          "abuse control actually gets EASIER in large operations -- if done properly."

          - and done properly means: (enter what you mean here)
          - and scales well: (enter scaling metrics here)


          "The problem is not that this task is hard."

          It is apparent that you have never attempted to do what you claim is easy. If you actually have knowledge on the subject then why be so coy about it?

        • MyNameHere (profile), 29 Oct 2017 @ 3:50am

          Re: Re: Re: If they can't manage their platforms...

          The problem for Twitter is actually pretty simple: they have a business model problem.

          What is the problem? Well, quite simply, even with the most simple, most automated operation, they don't make money. Earlier this year, they were proud to announce they only lost 62 million in a quarter, which was a big improvement over their previous results.

          So Twitter is facing the horrible problem of dealing with policing their site, while realizing that it costs them both expense and users.

          Essentially, run as a wild west, the business is socially interesting but a bottom-line failure. They have no desire to make it worse by banning people willy-nilly and dropping posts.

          Now on the other side, you have Instagram. It's a bit of a cesspool as well, but post up a picture with naked bits on it, and the image disappears pretty quick. Do it a couple of times, and your account disappears pretty quick too. They are taking care of business.

          Most sites simply don't have the money to address the problem.

    • orbitalinsertion (profile), 26 Oct 2017 @ 3:22pm

      Re: If they can't manage their platforms...

      I would agree that the "new IT" is not run by classical sysops, much to their detriment. But none of them ever had to police individual-created content at any such scale. No one can do that, and it is very arguable that no one should. This has nothing to do with what their systems are doing. Bots and spam are hardly the issues in question (but yes, they could do better with those sometimes).

      I have never been overly fond of most of the platforms in question, but not policing user content is so far at the bottom of my list of reasons (and mere preferences) that it fell off before it was ever on the list.

    • Anonymous Coward, 26 Oct 2017 @ 4:08pm

      Re: If they can't manage their platforms...

      There is some truth to what you say. They always half-ass technologies that don't work as intended.

      It is all profit, profit, profit, and then "how can we fix all the externalities we have created?" -- or just let others fix those.

    • That One Guy (profile), 26 Oct 2017 @ 4:53pm

      If you can't manage your roads...

      ... then you should shut them down.

      Responsible, ethical, competent, diligent politicians and police know EVERYTHING that people do on their roads. They design for that, they build for that, they operate for that. They simply don't allow for speeders or people changing lanes without using blinkers and all the other myriad forms of abuse. (And if those things happen anyway? They're quickly crushed out of existence and changes are made to avoid a repeat.)

      Yes, of course, the internet would be much better off without any platforms for user-submitted content that are actually useful. That makes perfect sense, and I thank you for opening my eyes to the brilliance of your position through your calm and articulate argument.

      I do have a small suggestion however. Just to save time and effort, perhaps replace the title with 'Nerd harder' next time so that people understand that your comment is coming from someone with extensive expertise and experience in the field and give it all due respect from the get-go.

  • This comment has been flagged by the community.
    Anonymous Coward, 26 Oct 2017 @ 3:11pm

    "Forcing Internet Platforms To Police Content Will Never Work" -- Even if were true, it's FUN!

    Those "Internet Platforms" use easily gotten money to control folk, keep money offshore through tax dodges "legalized" by politicians, track us all over without least permission nor way to stop them, are directly connected to intelligence agencies according to Snowden, so ANY measures that cause them to be slowed and hampered is FINE with me.

    Besides, your alternative is that mega-corporations get tens of billions without ANY responsibility to the societies that give them permission to exist in the public's marketplace.

    • This comment has been flagged by the community.
      Anonymous Coward, 26 Oct 2017 @ 3:12pm

      Re: "Forcing Internet Platforms To Police Content Will Never Work" -- Even if were true, it's FUN!

      Corporations are legal fictions, cannot suffer harm except to have income reduced, have NO rights and are NOT persons except to lawyers, romneys, and masnicks.

      Masnick here is almost explicitly for corporations as royalty, above all laws that mere "natural" persons must obey.

      And of course this is only a second-hand assertion of "laws". Any fool can make up laws. Here's one off the top of my head: if we let corporations and lawyers run wild, Nazism results, where corporations murder people for profit, and it'll all be "legal" -- even if not lawful or moral: corporations have no soul or empathy, nor any other frailty of "natural" persons. You are nothing but potential profit -- ANY way that can be "legalized".

      • Stephen T. Stone (profile), 26 Oct 2017 @ 3:45pm

        Yeesh.

        Any fool can make up laws.

        Thank you for proving your point.

        • Anonymous Coward, 26 Oct 2017 @ 4:20pm

          Re: Yeesh.

          But he/she just expressed an opinion, no real laws were enacted ;)

          Yet you, instead of debating the ideas, simply tried to discredit him/her personally.

      • Anonymous Coward, 26 Oct 2017 @ 4:17pm

        Re: Re: "Forcing Internet Platforms To Police Content Will Never Work" -- Even if were true, it's FUN!

        THIS.

        The problem is corporations corrupting government through money, aka CORRUPTION. That is the problem.

        But yeah, these days we see many companies and their slaves acting as if they are above the law.

        We need government above any corporation, including banks. We need government to not relate in any form to any corporation and not receive any money from them. Campaign money will come only from taxes, at a fixed % divided equally among all parties. That is it.

    • Anonymous Coward, 26 Oct 2017 @ 4:12pm

      Re: "Forcing Internet Platforms To Police Content Will Never Work" -- Even if were true, it's FUN!

      Yes, and governments play their part too: enacting biased laws, enforcing where they shouldn't and not where they should, etc.

      I think the problem is not just government or just corporations. It is both. The complicity between them, aka CORRUPTION.

    • Anonymous Coward, 27 Oct 2017 @ 12:45am

      Re: "Forcing Internet Platforms To Police Content Will Never Work" -- Even if were true, it's FUN!

      >so ANY measures that slow and hamper them are FINE with me.

      So you would not care if most of your comments were never made public, because that is what forcing vetting of comments and posting would do.

  • This comment has been flagged by the community.
    Anonymous Coward, 26 Oct 2017 @ 3:14pm

    Only 2 comments in 90 minutes and both opposing!

    You have about hit bottom, at least on this topic.

    Even the ardent fanboys seem to have drifted away...

    • Anonymous Coward, 26 Oct 2017 @ 6:06pm

      Re: Only 2 comments in 90 minutes and both opposing!

      And yet here you are...still...

  • Anonymous Coward, 26 Oct 2017 @ 4:01pm

    Why are other media, such as film, newspapers, magazines, books and others, able to show any kind of stuff? Anything from the most irreverent crap, to violent/sadistic bs, to racial/derogatory shiz, to fake/fraudulent/scammy ads, to pr0n/nudity/etc.?

    • Anonymous Coward, 26 Oct 2017 @ 4:15pm

      Re:

      because... it's on the internet, silly!

      • Anonymous Coward, 26 Oct 2017 @ 4:23pm

        Re: Re:

        Films that first appear in theaters? Yet they represent -- no, they idolize -- murder, rape, crime, etc.? Or books that present in detail how a crime was committed? Or music that talked about drugs and beating the whores, long before there was any internet? Or all those scam ads on radio and TV of yesterdecade?

        • Stephen T. Stone (profile), 26 Oct 2017 @ 4:56pm

          Re: Re: Re:

          The Internet is not a content medium. It is a communications protocol, a method of content delivery. Arguing that books, movies, etc. face less scrutiny than the Internet shows an astounding amount of ignorance surrounding that fact. It also shows a similar ignorance of just how much scrutiny books, movies, etc. have faced over the years from both powerful groups of lay people and far more powerful government institutions.

    • That One Guy (profile), 26 Oct 2017 @ 4:45pm

      Re:

      Well you see, it started with the decline in society that led to women dressing indecently, wearing clothing such that people could see not only their ankles but their legs, which of course inflamed the minds of all good, moral men such that sex and porn were all they ever thought about from dawn to dusk, and it just went downhill from there with the drinking and the rocking and the rolling.

    • Anonymous Coward, 27 Oct 2017 @ 2:35am

      Re:

      Do you understand the difference between a publisher and a platform for self-publishing?

      With a publisher, an editor looks at works submitted for publication and decides which ones they will publish.

      A platform, on the other hand, allows anyone to publish their own works.

      Force platforms to become publishers, and they will act as gatekeepers and only publish a fraction of the works submitted -- not because there is anything illegal about the rejected works, but because they cannot look at all works, and of those they do look at, they will publish those that in their opinion will gain the largest audience.

  • Anonymous Coward, 26 Oct 2017 @ 5:25pm

    "At a time when the clamor for mandatory content moderation on the internet seems to be reaching a fever pitch, we should be careful what we wish for. It won't work well and the end results may make just about everything worse."

    That's just it Mike, we didn't wish for it. Again, you're just trying to project the illusion that your readers and the public at large are "somehow" behind the decisions that Google makes, so that when they cave into those they're in liege with, with regard to things they patently should not cave into, "somehow" Google is "blameless" because everyone "clamored" for it. No one "clamored" for it. The governmental powers that Google are in liege with, that they pretend not to be in liege with, wanted it, and as usual, Google gave it to them. And also as usual, you back everything they do at some level and always try to present them as innocent roses, simply "giving people what they want", rather than the insidious conniving control freak whores they actually are in real life.

    • Anonymous Coward, 26 Oct 2017 @ 6:10pm

      Re:

      “Google, My liege, the ISPs are beaten! Only the FCC stands between us and total victory!”

      “Our people clamour for your voice to lead them as usual!”

      Mike in an alternate universe somewhere.

    • Stephen T. Stone (profile), 26 Oct 2017 @ 6:27pm

      Re:

      you back everything they do at some level and always try to present them as innocent roses

      [citation needed]

  • stderric (profile), 26 Oct 2017 @ 5:57pm

    You know, watching a man have a romantic dinner with his favorite pair of socks isn't nearly as entertaining as one might expect. Maybe it'll get better after they finish the word salad.

  • MyNameHere (profile), 26 Oct 2017 @ 6:20pm

    black or white mentality (not racist either)

    I have a real problem with the all or nothing, black or white mentality in play here. If you can't do everything, clearly you assume you should do nothing. That's defeatist and ignorant.

    Almost every platform does something. Even Techdirt filters out "spam" *cough*. Facebook deletes obviously fake accounts, Twitter bans certain accounts, Google deletes email boxes of spammers and scammers.

    The world is rarely black and white.

    The arrogant attitude of many internet services comes from the idea that to be "free speech" they also have to be willfully blind and ignorant to everything that happens in their place of business, their sites, their domain. Guys like Kim Dotcom have convinced you that total ignorance is a perfect legal defense, even if you have to wear blinders and purposely ignore everything that is going on.

    It's arrogant, and it's morally bankrupt.

    One of the reasons there is such a huge backlash against the Silicon Valley types is the huge amount of arrogance, the absence of morals, and the things that go against the common man. So-called "Jerk Tech", also referred to as 1% apps, is set up by ignoring social norms, ignoring the law, and just going for it regardless. When questioned, they fall back on the old "just a service, not responsible" line that infuriates so many.

    We all have a certain level of self-responsibility. We all have a certain minimum standard in our lives. We would not allow drug dealers, fraudsters, pedos, murderers, and other criminals to operate out of our offices or our homes. Why should we suddenly forget about that because it's on our site, our app, our domain, or our service?

    Nobody is after online sites to do more than the real world. They are pushing to get the online world to respect the norms everyone else does.

    • Anonymous Coward, 26 Oct 2017 @ 7:03pm

      Re: black or white mentality (not racist either)

      _If you can't do everything, clearly you assume you should do nothing. That's defeatist and ignorant._

      Said Mr. "It's the law is the law is the law is the law".

      _We all have a certain level of self-responsibility. We all have a certain minimum standard in our lives._

      Unarmed-citizen-shooting-cop-apologist says what?

    • That One Guy (profile), 26 Oct 2017 @ 7:42pm

      Re: black or white mentality (not racist either)

      I have a real problem with the all or nothing, black or white mentality in play here. If you can't do everything, clearly you assume you should do nothing. That's defeatist and ignorant.

      If you have a problem with it, then you should stop beating up a strawman of it. No one is saying 'sites shouldn't do anything'; the argument is that if they are legally obligated to 'do something' then the harm will vastly outweigh the gains, and when pressured to 'do something' the scope of the problem often means boneheaded 'collateral damage' occurs.

      'How could YouTube -- pressured to stop terrorist content from appearing -- take down evidence of war crimes instead? How could Facebook -- pressured to stop harassment and abuse -- leave that content up, but silence those reporting the harassment and abuse?'

      The arrogant attitude of many internet services comes from the idea that to be "free speech" they also have to be willfully blind and ignorant to everything that happens in their place of business, their sites, their domain.

      It's not 'willful blindness' to not do what you aren't required to do, or even can do in any feasible fashion. If I told you that I wanted you to record and vet every conversation by everyone on your block in case someone uttered a phrase I found offensive it would not be 'willful blindness' on your part to refuse.

      We all have a certain level of self-responsibility. We all have a certain minimum standard in our lives. We would not allow drug dealers, fraudsters, pedos, murderers, and other criminals to operate out of our offices or our homes. Why should we suddenly forget about that because it's on our site, our app, our domain, or our service?

      And the minute you can prove that someone is breaking the law you can go after them. A platform/site has no 'self-responsibility' to pro-actively filter and vet everything just because someone might do something bad with their platform/product/service.

      Nobody is after online sites to do more than the real world. They are pushing to get the online world to respect the norms everyone else does.

      Well it's a good thing online services are comparable to offline ones then, such that demanding that they 'respect the norms everyone else does' translates perfectly well. You know, like how laws regarding aviation translate perfectly to people walking on the sidewalk; they're both forms of transportation, so clearly the same laws and norms should apply.

      Online platforms are significantly different from offline publishers, such that what works for one does not work for the other. If a site like youtube was forced to vet content like a newspaper, it wouldn't exist. If a service that allowed people to host sites was held accountable for the content that someone might post, then that wouldn't exist either.

      • MyNameHere (profile), 27 Oct 2017 @ 5:15am

        Re: Re: black or white mentality (not racist either)

        "If you have a problem with it then you should stop beating up the strawman of it. No-one is saying 'sites shouldn't do anything', the argument is that if they are legally obligated to 'do something' then the harm will vastly outweigh the gains, and when pressured to 'do something' the scope of the problem often means boneheaded 'collateral damage' occurs."

        Actually, the argument that has been raised is that if they do anything, then they could be liable for everything, and thus will choose to do nothing. Argument most commonly used in reference to SESTA.

        "It's not 'willful blindness' to not do what you aren't required to do, or even can do in any feasible fashion. If I told you that I wanted you to record and vet every conversation by everyone on your block in case someone uttered a phrase I found offensive it would not be 'willful blindness' on your part to refuse."

        Your example doesn't work. Public conversations are not published by third parties -- they are public and can be controlled only by the person making them. Published on a website, there is an element of potential control that enters into things. The website can choose not to host the "speech", even if it's free.

        "And the minute you can prove that someone is breaking the law you can go after them."

        That would be way too high a standard. The proof would require successful prosecution, and potentially all appeals completed, before it would be entirely official. Since that process can often take years, are you willing to allow all illegal content (say, pedo stuff) to stay online until a prosecution is completed? That answer clearly is no, so then after that, it's just a question of where in the murky sands you care to draw your personal line.

        "Well it's a good thing online services are comparable to offline ones then, such that demanding that they 'respect the norms everyone else does' translates perfectly well. You know, like how laws regarding aviation translate perfectly to people walking on the sidewalk; they're both forms of transportation, so clearly the same laws and norms should apply."

        Nice try, but the analogy fails. We are talking speech versus speech here, not flying. In the real world, a magazine would not print pedo pictures. Yet your view seems to be that until the person is prosecuted, publishing them online is fine. How weird is that? Why is the online world given a free pass that doesn't exist anywhere else?

        "If a site like youtube was forced to vet content like a newspaper was it wouldn't exist. "

        Actually, newspapers existed and continued to exist exactly because they did vet their content. The question of volume is one of business models. Are you willing to trade societal norms and responsibility for volume? That seems like a very poor trade.

        • Anonymous Coward, 27 Oct 2017 @ 5:23am

          Re: Re: Re: black or white mentality (not racist either)

          >Published on a website, there is an element of potential control that enters into things.

          Just like governments make the roads, and issue driving licenses, and therefore have an element of control, and so are responsible for accidents caused by speeding and dangerous driving, because they did not stop it.

        • That One Guy (profile), 27 Oct 2017 @ 6:52pm

          Re: Re: Re: black or white mentality (not racist either)

          Actually, the argument that has been raised is that if they do anything, then they could be liable for everything, and thus will choose to do nothing. Argument most commonly used in reference to SESTA.

          Because the law would penalize 'knowing' that illegal content was on the service, such that it would be safer to not look at any of it and keep an entirely hands-off approach. That's not a strawman, it's a result of a stupid law with lousy wording.

          Your example doesn't work. Public conversations are not published by third parties -- they are public and can be controlled only by the person making them. Published on a website, there is an element of potential control that enters into things. The website can choose not to host the "speech", even if it's free.

          In the sense that they can refuse to host any speech, sure, they have 'an element of control'. For a platform like youtube the 'control' is minimal unless they start pre-vetting everything, and given how much content that would be it would all but destroy the platform entirely.

          That would be way too high a standard. The proof would require successful prosecution, and potentially all appeals completed, before it would be entirely official. Since that process can often take years, are you willing to allow all illegal content (say, pedo stuff) to stay online until a prosecution is completed? That answer clearly is no, so then after that, it's just a question of where in the murky sands you care to draw your personal line.

          ... And of course you go with child porn for your example, the universally illegal content which no even remotely legal site knowingly hosts. Classy.

          With regards to content which is not blatantly illegal, like say speech that may or may not be defamatory or infringing, then yes, a court finding should be required, because it should not be easy to remove speech.

          Nice try, but the analogy fails. We are talking speech versus speech here, not flying. In the real world, a magazine would not print pedo pictures. Yet your view seems to be that until the person is prosecuted, publishing them online is fine. How weird is that? Why is the online world given a free pass that doesn't exist anywhere else?

          Continuing on the child porn argument and the flawed comparison between publishers and platforms, you are on a roll.

          The 'online world' isn't given a pass, it operates under slightly different rules because, and this may surprise you, things are different.

          A newspaper chooses what they publish, and can easily pre-vet the content they publish as the scope is dramatically smaller and they don't post content in near-real time, giving them time to do so.

          A platform, on the other hand, merely offers a way for others to decide what to publish, and typically deals with such a massive amount of content that it would be effectively impossible for them to pre-vet anything where there isn't a clear and easily verifiable way to check the legality of particular content. A filter like that is possible for 'no context required' content like your favorite example, and I believe one actually does exist, but for content where the legality is not instantly obvious it would require manual review, which isn't feasible because of scope, and would involve removal of speech/content which had not been found illegal, simply because it might be.
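
          (For concreteness: a minimal sketch of the kind of 'no context required' filter described above, matching uploads against a blocklist of known-bad file hashes. Real deployed systems such as PhotoDNA use perceptual hashes that survive re-encoding; this exact-match version, with a made-up blocklist, is just an illustration.)

              import hashlib

              # Hypothetical blocklist of SHA-256 digests, e.g. supplied by a clearinghouse.
              BLOCKED_HASHES = {
                  "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
              }

              def is_blocked(upload: bytes) -> bool:
                  """True if the upload exactly matches a known-bad file hash."""
                  return hashlib.sha256(upload).hexdigest() in BLOCKED_HASHES

              print(is_blocked(b"test"))  # True: the demo list holds sha256(b"test")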

          Actually, newspapers existed and continued to exist exactly because they did vet their content. The question of volume is one of business models.

          No, it's not, because Youtube's business model is not the same as a newspaper's, and it does not include 'pre-vetting posted content' like a newspaper's does. Their business model is to act as a platform rather than a publisher, such that volume is not a 'business model' problem until it reaches a point where they can't handle (not vet, mind) the amount of content being posted.

          Are you willing to trade societal norms and responsibility for volume? That seems like a very poor trade.

          Let's see: massive amounts of content and creativity and speech flourishing like never before, with people having access to numerous platforms; 'new' business models like patronage allowing more people than ever before to make, if not a living, then a modest amount from their creativity -- creativity that wouldn't have been able to flourish under a more locked-down, pre-vetted gatekeeper model -- in exchange for... what exactly is being 'traded off' again? Some people who use the platforms available in bad or even illegal ways?

          Yeah, I'll take that 'trade' any day.

          • MyNameHere (profile), 28 Oct 2017 @ 2:30am

            Re: Re: Re: Re: black or white mentality (not racist either)

            You sort of lost the plot; you are too focused on the "child pr0n is obvious!" angle and worried about that, rather than worrying about the concept.

            Without a conviction, how do you really know it's CP, or illegal in any manner? If a site is willing to make that choice without waiting for a conviction, where does the line get drawn? It ends up being a discussion on a level that is meaningless in a legal context, because it becomes about how you feel about something. Is the KKK posting hateful messages illegal, or just hateful? Does a site like Twitter remove it because they have a conviction in hand, or just because they don't want hateful messages on their site?

            "A newspaper chooses what they publish, and can easily pre-vet the content they publish as the scope is dramatically smaller and they don't post content in near-real time, giving them time to do so."

            A website has the same option. They choose to forego that option and instead publish everything as fast as they can. It's a choice. A newspaper could decide to just print everything they get verbatim without consideration. Would they be more or less liable?

            Like it or not, "platform" owners do have control over what appears on their sites. Youtube as an example groups things together, allows comments, suggests related materials, helps to curate playlists and suggests more content -- and then arranges it all on pages with relevant advertising and such. Clearly, it's more like a newspaper (publication) and less like a printing press. They are your partners in content distribution.

            "massive amounts of content and creativity and speech flourishing like never before with people having access to numerous platforms,"

            In no small part because of an absence of liability.

            "what exactly is being 'traded off' again? Some people who use the platforms available in bad or even illegal ways?"

            Actually, what is being traded is responsibility. The internet "revolution" is in many ways no different from the hippies. They wanted a world where you could do your own thing and nobody bothered you. It doesn't work. Your actions, your words, and your attitudes affect others around you, like it or not. What you are suggesting is that "platforms" have no responsibility for what they profit from, but equally have no responsibility to know where the material comes from. It creates a liability gap which is untenable. What you traded away was decorum and respect. What you get instead is sort of an online version of street gangs. How nice is that?

            • Anonymous Coward, 28 Oct 2017 @ 8:36am

              Re: Re: Re: Re: Re: black or white mentality (not racist either)

              Decorum and respect? The RIAA lost that plot a long, long time ago, harassing people for settlements.

            • That One Guy (profile), 28 Oct 2017 @ 6:30pm

              Re: Re: Re: Re: Re: black or white mentality (not racist either)

              It ends up being a discussion on a level that is meaningless in a legal context, because it becomes about how you feel about something. Is the KKK posting hateful messages illegal, or just hateful? Does a site like Twitter remove it because they have a conviction in hand, or just because they don't want hateful messages on their site?

              So long as they're just being hateful assholes, then it's perfectly legal for them to be so. Whether or not Twitter wants to offer them a platform to do so is their business, and should not carry a legal penalty if they decide that they don't want to get involved and stay neutral.

              The thing about making legal penalties for platforms is that it will chill speech. You may object to someone else's speech and think it shouldn't be allowed, but search long enough and it's pretty much a given that someone else will object to your speech and want it removed. It is both naive and arrogant to think that putting in place penalties for 'wrong' speech will never affect you, or that even if it doesn't all the speech that gets removed as a consequence wasn't that important and was acceptable collateral damage.

              A website has the same option. They choose to forego that option and instead publish everything as fast as they can. It's a choice. A newspaper could decide to just print everything they get verbatim without consideration. Would they be more or less liable?

              No, they host content, which differs from a newspaper, which publishes. One involves a company providing a platform for others to post content; the other involves a company choosing which content they want to publish, whether it be their own or content that has been submitted to them. Your continued conflation of the two is not doing you any favors, and at this point I can only assume it's deliberate.

              As for 'they have that option': no, not if they want to be of any real use to more than a handful of people. Youtube deals with hours of content every minute; if manual review were required, they would be forced to shut down overnight.

              Dig past the surface of the lofty 'social responsibility and obligations' rhetoric, and what you and those arguing for legal obligations involving manual review are actually arguing for is for open platforms such as youtube to shut down and simply cease to exist -- that any platform that cannot pre-vet the content posted to it, and magically know what is and is not illegal, should not exist.

              As I said before, that is not a cost I'd find acceptable, nor a trade I'd be willing to make, as society, culture and creativity would be the poorer because of it.

              Like it or not, "platform" owners do have control over what appears on their sites. Youtube as an example groups things together, allows comments, suggests related materials, helps to curate playlists and suggests more content -- and then arranges it all on pages with relevant advertising and such. Clearly, it's more like a newspaper (publication) and less like a printing press.

              And were any of that not automated, you might have had a point, but I'm fairly certain that most if not all of it is done by automated programming which looks for certain patterns and/or flags and acts accordingly, or allows the users to use certain tools involving the content, like creating playlists or leaving comments (the latter of which I have no idea why you included, unless you're arguing that allowing user comments results in site liability, which is flat out wrong).

              'Video 1, 34, and 75 involve 'gaming'. Viewer A has watched these videos. Videos 34, 106 and 347 also involve 'gaming', therefore they go on the 'Recommended' list for Viewer A.' At no point does someone from Youtube need to step in and make a choice, or involve themselves at all really unless something screws up.
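
              (To illustrate: a minimal sketch of that fully automated tag-matching logic, with made-up video IDs and tags, and no human in the loop:)

                  # Hypothetical catalog: video ID -> tags.
                  video_tags = {1: {"gaming"}, 34: {"gaming"}, 75: {"gaming"},
                                106: {"gaming"}, 347: {"gaming"}, 500: {"cooking"}}

                  def recommend(watched):
                      """Suggest unwatched videos sharing a tag with the watch history."""
                      seen = set().union(*(video_tags[v] for v in watched))
                      return {v for v, tags in video_tags.items()
                              if v not in watched and tags & seen}

                  print(recommend({1, 34, 75}))  # {106, 347} -- pure set overlap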

              In no small part because of an absence of liability.

              Yes, how unfortunate that the platforms can't be targeted for the actions of their users; what a terrible shame it is that society has been cursed with the resulting explosion of creativity, content and speech.

              What you are suggesting is that "platforms" have no responsibility for what they profit from, but equally have no responsibility to know where the material comes from.

              Unless they are the ones creating the content, correct. The most they can realistically do, and which they already do, is to have the one posting it confirm that, as far as they know, they have the legal right to post it and it isn't breaking any laws.

              What you traded away was decorum and respect.

              Yes, those dang kids with their loud music and crazy clothes, no decorum or class at all!

              As for 'respect', somehow I think, though it may be really hard, that I can do without the respect of those who would stifle platforms enabling massive amounts of free speech and content that wouldn't be possible without them, simply because some people might use them in the 'wrong' way.

              What you get instead is sort of an online version of street gangs. How nice is that?

              As an assertion, terrible. As far as reality goes those 'gangs' seem to be just like any other large group of people, with people all across the spectrum, some good, some bad, but most mediocre.

              • MyNameHere (profile), 29 Oct 2017 @ 1:48pm

                Re: Re: Re: Re: Re: Re: black or white mentality (not racist either)

                "No, they host content, "

                This is where we disagree.

                Hosting content would mean that someone puts something up (a web page, an image file, whatever) and it is presented in the same format if you ask for the correct URL. Facebook, Twitter, YouTube -- they don't really do that. They aggregate and publish content (like a newspaper) from different sources. They (and they alone) get to choose how it appears, if it appears, where it appears, and when it appears. Your Twitter feed allows you only to choose whom you want stories from, and even then Twitter injects "suggested stories" or "sponsored tweets" into it. They decide everything from the font to the order; they mute writers they don't like, they promote others, and they choose who is in the "in case you missed it" part of your page when you get there.

                You have to consider that a post on Facebook is no different from a "wire story" that a newspaper gets. The newspaper still decides what page it goes on, what font size it has, whether they include the picture or not, and so on. Facebook has that same control over the posts you make. They choose when and where they appear -- the order, the font, appearance, position... or for that matter whether it is seen at all. Consider their most recent changes in regards to your "feed": a small change has sent traffic to certain sites and pages through the floor. The users didn't decide that; Facebook did.

                So I don't feel these sites are just hosts. They are publishers. They abdicate responsibility to check what they publish - and that's the problem.

                • That One Guy (profile), 29 Oct 2017 @ 6:33pm

                  Re: Re: Re: Re: Re: Re: Re: black or white mentality (not racist either)

                  Hosting content would mean that someone puts something up (a web page, an image file, whatever) and it is presented in the same format if you ask for the correct URL. Facebook, Twitter, YouTube -- they don't really do that.

                  Were that the case, then unless I am completely misreading you, the only people who would be able to post content would be those with the knowledge of how to write code. Want to write something up for your personal blog and/or post a picture? Better know all the HTML/markdown coding to get the page looking like you want it to. Don't know how? Tough, no page for you.

                  That a platform like youtube chooses which font to use and the page layout does not mean they are choosing anything meaningful as far as content goes. Which videos get posted is (almost) entirely up to the users, with youtube's involvement being nearly all automated: allowing someone to easily post something and have it appear in a usable manner. Saying they are 'choosing' when/how/what appears is like saying that a soda machine 'chooses' which soda will appear when you push the button; it's all automated.

                  So I don't feel these sites are just hosts. They are publishers. They abdicate responsibility to check what they publish - and that's the problem.

                  Yes, how terrible that they aren't doing what is neither legally required of them nor even remotely feasible. That's not a 'problem'; it's what allows them to function in any meaningful way.

                  The bar you're setting for 'publisher' is so low that pretty much no platform would qualify, and anyone who wanted to post something would need to learn coding to do so. To say that standard would have serious repercussions for free speech, culture and creativity would be a serious understatement, and it is not one I see as reasonable or viable.

                  • MyNameHere (profile), 31 Oct 2017 @ 8:33am

                    Re: Re: Re: Re: Re: Re: Re: Re: black or white mentality (not racist either)

                    "Were that the case then unless I am completely misreading you the only people who would be able to post content would be those with the knowledge of how to write code. Want to write something up for your personal blog and/or post a picture? Better know all the HTML/markdown coding to get the page looking like you want it to. Don't know how? Tough, no page for you."

                    Strawman.

                    Anyone (and I do mean anyone) can open their own website. Products such as Wordpress make it really easy to operate your own blog, "wordpress hosting" means you can be up and running in minutes, and you need to know very little or no HTML or other coding to do so. Wordpress is a 5 minute install.

                    "That a platform like youtube chooses which font to use and the page layout does not mean they are choosing anything meaningful as far as content goes. "

                    The point is that Youtube offers more than just hosting -- and it's not particularly optional either. To post a video, you need to have an account. Your video will always appear Youtube-branded (in their player) and will always have links to Youtube as a result. Your video page will never be the video alone; there will always be other things added by Youtube, from advertising and "recommended videos" to discussion about your video, friends lists, etc.

                    You control none of it. Youtube does. You get only to submit the video. You don't get to control it after that, except to remove it. Youtube decides when and where it appears, how it is promoted on their site, and so on.

                    In fact, when it comes to look and feel, operations, and public interaction, you are nothing more than a content provider for their publishing network. They control everything past the point of creation.

                    "Yes, how terrible that they aren't doing what is both not legally obligated of them or even remotely feasible. That's not a 'problem', it's what allows them to function in any meaningful way."

                    Legally, they do have certain obligations (kiddy porn is the usual boogie man). There are certain things they cannot have on their site, no matter how much of an "innocent host" they want to be.

                    As for "remotely feasible" all I can say is that is very much and exactly what I consider a "business model" problem. If being legal and within the law isn't "remotely feasible" then you have a problem with your whole business setup.

                    It cuts both ways.

                • Mike Masnick (profile), 30 Oct 2017 @ 12:30am

                  Re: Re: Re: Re: Re: Re: Re: black or white mentality (not racist either)

                  So I don't feel these sites are just hosts. They are publishers. They abdicate responsibility to check what they publish - and that's the problem.

                  Hi. Hello. Can you point to a single court that has agreed with this inane interpretation of the law, or can we agree that you're just making up shit whole cloth again?

                  • MyNameHere (profile), 31 Oct 2017 @ 2:19am

                    Re: Re: Re: Re: Re: Re: Re: Re: black or white mentality (not racist either)

                    For the moment, no. The DMCA created a rather large hole that pretty much everyone and their dog has driven through.

                    However, it's changing slowly. EU copyright reforms look like they are going to strip away at least some of the protections; they seem to have a better understanding of the difference between a neutral hosting company and an online content aggregation publisher.

                    Of course, maybe the FT was making it up from whole cloth? https://www.ft.com/content/7dec4252-7a85-11e6-ae24-f193b105145e

              • The Wanderer (profile), 30 Oct 2017 @ 4:42am

                Re: Re: Re: Re: Re: Re: black or white mentality (not racist either)

                No, they host content, which differs from a newspaper, which publishes. One involves a company providing a platform for others to post content; the other involves a company choosing which content they want to publish, whether it be their own or content that has been submitted to them. Your continued conflation of the two is not doing you any favors, and at this point I can only assume it's deliberate.

                The way I read it (by incorporating his mentions of "business model" in these threads), his argument is that:

                • Regardless of whether or not YouTube's business model is "provide a platform for other people to post content", it still has the same obligations to society as did any pre-Internet middleman.

                • One of those obligations is to filter the content made available, so that particular (albeit not fixed) bad things are not made available.

                • If it is impossible to fulfill these obligations under the chosen business model, that is the company's problem for having chosen this business model; it must either find a way to do that impossible thing, or shut down.

                His

                The question of volume is one of business models. Are you willing to trade societal norms and responsibility for volume? That seems like a very poor trade.

                seems like another way of expressing this same view, and you've already responded to that one.

                • MyNameHere (profile), 31 Oct 2017 @ 8:25am

                  Re: Re: Re: Re: Re: Re: Re: black or white mentality (not racist either)

                  Actually, no.

                  "Regardless of whether or not YouTube's business model is "provide a platform for other people to post content", it still has the same obligations to society as did any pre-Internet middleman.
                  "

                  Does Youtube only provide hosting for videos, or do they aggregate them, publish them on pages, insert ads, track favorites, make recommendations, and offer things like accounts, favorite lists, and the like? Hosting alone would be "here is the video", Youtube offers up much more.

                  "One of those obligations is to filter the content made available, so that particular (albeit not fixed) bad things are not made available."

                  I don't think that filtering for bad things is the point. Filtering for illegal things should be. They are already fast to remove kiddy porn, or a stray nipple for that matter, so taking a bit more care isn't out of the reach of what they do. When it came time to filter out stuff that would hurt them making ad dollars, they were pretty quick about it, right?

                  "If it is impossible to fulfill these obligations under the chosen business model, that is the company's problem for having chosen this business model; it must either find a way to do that impossible thing, or shut down."

                  If my business model is selling ice cream cones for 50% less than they cost to produce, I have a bad business model. If I operate a website that cannot generate as much income as it costs to run, it seems pretty logical that it too has a bad business model.

                  In the end, it's pretty simple: if your business model only works because you don't care about what you are publishing on your website(s), then you have a problem. More so if that is the only way it comes even marginally close to making money. If actually having to give a crap about what you are publishing, promoting, aggregating, and re-distributing makes your business model fail, then perhaps it's better that way.

  • Anonymous Coward, 26 Oct 2017 @ 7:30pm

    It's not about forcing internet platforms to police content. It's about having something with which to attack internet platforms that the police/government don't like.

  • DannyB (profile), 27 Oct 2017 @ 6:15am

    It DOES Work!

    Forcing Internet Platforms To Police Content Will Never Work

    It DOES work. For certain values of 'work'. It just doesn't work in the way we would like. Think about how it actually works and what it accomplishes: quick automatic takedowns of anything, anywhere, anytime -- with no consequences and no penalty of perjury, even if those takedowns censor breaking news or have economic consequences. Drive-by takedowns.

    If Hollywood is so smart, why can't it build a single centralized website that lets any visitor fire off large-scale automated DMCA takedowns to a list of sites offered as selectable checkboxes?

    How can you even say "Forcing Internet Platforms To Police Content Will Never Work"?

    It DOES work. For Hollywood to censor and vandalize the internet. Which is what they want.

  • Anonymous Coward, 27 Oct 2017 @ 1:36pm

    This is like forcing people to do the job of the police.


