Facebook And Google Finally Take First Steps On Road To Transparency About Content Moderation

from the but-more-work-is-needed dept

As internet platforms are aggressively expanding their “moderation” of problematic content in response to increased pressure from policymakers and the public, how can we best hold them accountable and make sure that these private censorship regimes are fair, proportionate, accurate and unbiased?

As we wrote in our last piece for Techdirt at the beginning of the year, right before the first Content Moderation and Removal at Scale Conference in Santa Clara, there is a dire need for meaningful transparency and accountability around content moderation efforts in order to ensure that the new rulers of our virtual public squares–practically governments in their own right, with billions of citizens–are using their power to moderate speech responsibly. This need has only grown as the pressure on Facebookistan and Googledom to deal with the extremists, white supremacists, and fake news operations on their platforms has also grown, and as questions about whether they are abusing their power by not taking down enough content–or by taking down too much–have proliferated.

This trend was most evident in the recent Congressional hearings prompted by the Cambridge Analytica scandal, where some lawmakers rebuked Facebook CEO Mark Zuckerberg for not doing enough to keep certain content off the platform, while others raised concerns that Facebook had demonstrated political bias against the right when determining what content to take down. Similar concerns were voiced by Republicans at today’s hearing in the House Judiciary Committee focused on examining major internet platforms’ content moderation practices (despite the fact that claims of anti-conservative bias have been thoroughly debunked). Such concerns are not limited to the right wing, though–charges of racially-biased censorship have also been leveled from the left.

In response to these growing pressures–and in no small part thanks to years of consistent demands from free expression advocates–Google and Facebook this week both took major strides towards “doing the right thing” and promoting greater transparency around their content moderation practices, in ways that mirror what we were advocating for in our previous article.

First, on Monday afternoon, Google released the industry’s first detailed transparency report focused on content moderation, giving statistics about YouTube content removals based on violations of the service’s Community Guidelines. Among other things, the report highlights the total number of videos removed in the last quarter of 2017 (a staggering 8,284,039 videos), the percentage of videos flagged by human users versus YouTube’s automated flagging systems (the robots flagged four times as many videos as the humans), and a percentage breakdown of the different reasons human flaggers had flagged content (spam, sexual content, hate speech, terrorist content, etc.). This is the first time any company has published this sort of data at this level of detail–and now that YouTube has taken the first step, it certainly won’t be the last company to do so.
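
To make the report’s arithmetic concrete, here is a minimal sketch in Python of the kind of aggregation behind such numbers; the log entries, category names, and counts below are hypothetical stand-ins, not YouTube’s actual data or any real API.

```python
from collections import Counter

# Hypothetical takedown log: (flag_source, reason) pairs.
# Purely illustrative stand-ins, not YouTube's actual data.
takedowns = [
    ("automated", "spam"),
    ("automated", "sexual_content"),
    ("automated", "spam"),
    ("automated", "terrorist_content"),
    ("human", "hate_speech"),
    ("human", "spam"),
]

total = len(takedowns)
by_source = Counter(source for source, _ in takedowns)
human_reasons = Counter(reason for source, reason in takedowns if source == "human")

print(f"Total removals: {total}")
for source, count in by_source.items():
    print(f"  flagged by {source}: {count} ({100 * count / total:.0f}%)")

# Percentage breakdown of human-flagged removals by reason,
# mirroring the per-category percentages YouTube reported.
for reason, count in human_reasons.items():
    print(f"  human-flagged {reason}: {100 * count / by_source['human']:.0f}%")
```

With the roughly 4:1 automated-to-human ratio the report describes, a split like this comes out to about 80% machine-flagged, which is why the absence of a per-category breakdown for the automated side (discussed below) matters so much.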

Soon after YouTube’s trailblazing transparency report, on Tuesday morning, Facebook made a major announcement of its own. The company published a much more comprehensive version of its Community Standards, including the detailed internal guidelines the company uses to make moderation decisions, and highlighting the “spirit” of its content policies in order to generate greater understanding of why and how the company removes content. In addition, for the first time, the company is giving users the ability to appeal takedown decisions made on individual posts. Appealed posts will be reviewed by a human moderator on the company’s appeals team within 24 hours. Prior to this announcement, users could appeal the removal of pages and groups, but the introduction of this process for individual posts is a valuable step towards providing users with greater agency over their content and more engagement in the moderation process.

Taken together, these moves have sharply increased both the quantitative transparency (Google’s numbers) and the qualitative transparency (Facebook’s explanations) around content takedowns, while also improving due process around those takedowns (Facebook’s new appeals). These are both critical first steps, but there is definitely more to be done. For example, although YouTube published a significant amount of data about the types of objectionable content removed as a result of human flagging, it did not publish similar data for content flagged by its automated systems, which is especially concerning since those systems flagged the vast majority of objectionable content. Meanwhile, although Facebook’s introduction of an appeals process is a valuable step towards providing users with stronger due process, it currently only applies to hate speech, graphic violence, and nudity/sexual activity, which have been the most controversial categories of objectionable content. In order for this process to be truly impactful, it needs to apply to all forms of content that are being taken down–and the process needs to give impacted users a way to argue their case for why their content should stay up.

Going forward, Facebook and Google also need to take a page out of each other’s books. Like Google, Facebook needs to start reporting quantitative data on its takedowns and how they have affected different categories of objectionable content, not only for itself but also for its other products like WhatsApp and Instagram. Similarly, Google needs to provide users with greater qualitative insight into the guidelines that govern content takedowns, just as Facebook has. It should also expand its takedown reporting to include other Google products and services such as Google+ and the Google Play store. Doing so could help pressure Apple to similarly report on takedowns in its App Store, thereby further expanding transparency reporting in this space.

And that’s the real value of these new steps, beyond the transparency itself: Google and Facebook’s new efforts will hopefully push the rest of the industry to compete with them on transparency. Google’s first innovations around transparency reporting on government surveillance demands nearly a decade ago helped set the stage for a domino effect of widespread adoption once the Snowden surveillance scandal broke, as detailed in this timeline and case study on the spread of that reporting practice. In this political moment of “techlash” that has now been turbo-charged by the Cambridge Analytica scandal, the adoption of strong content moderation transparency practices may happen even faster–but only if policymakers and advocates keep demanding it. That includes voices that have been pressing on this issue for years, such as the ACLU of Northern California, the Electronic Frontier Foundation, our own organization the Open Technology Institute, and the Ranking Digital Rights project (which just yesterday released its third annual ranking of how well tech companies are protecting users’ human rights. Spoiler alert: they’re not doing so great). And since we’re catching this practice at its beginning, perhaps with the right pressure we can not only get all the companies to issue reports but also get them to standardize their reporting formats. Otherwise we may end up with the same crazy quilt of formats that we have in other areas of transparency reporting, which makes it that much harder to meaningfully compare and combine data.
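
As a thought experiment, a standardized, machine-readable takedown report might look something like the sketch below; every field name is our own hypothetical invention, not a schema any company has adopted, and the numbers are either the figure YouTube published or clearly labeled approximations.

```python
from dataclasses import dataclass, field, asdict
import json

# A hypothetical, standardized takedown transparency record.
# All field names are illustrative inventions, not an industry schema.
@dataclass
class TakedownReport:
    platform: str
    period: str                        # reporting window, e.g. "2017-Q4"
    total_removals: int
    removals_by_flag_source: dict = field(default_factory=dict)
    removals_by_reason: dict = field(default_factory=dict)
    appeals_received: int = 0
    appeals_reinstated: int = 0

report = TakedownReport(
    platform="ExampleTube",
    period="2017-Q4",
    total_removals=8_284_039,  # the total YouTube actually reported
    # Illustrative split based on the report's roughly 4:1
    # automated-to-human flagging ratio, not exact published numbers.
    removals_by_flag_source={"automated": 6_627_000, "human": 1_657_000},
)

# One shared JSON shape like this would let researchers compare and
# combine reports across companies instead of untangling a different
# "crazy quilt" format from each one.
print(json.dumps(asdict(report), indent=2))
```

Even a minimal shared core like this (totals, flag source, reason, appeals) would make cross-platform comparison tractable in a way that scattered, company-specific formats are not.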

More than pressure, though, we’ll also need continued dialogue with the companies, to better understand how their content moderation and reporting processes do and don’t work, what their biggest challenges are when moderating at scale, and where they think the technology and practice of content moderation and reporting are heading. That’s why our organization, along with many others, is co-hosting the second Content Moderation at Scale Conference in Washington, DC on May 7, where representatives from a wide range of tech companies, both big and small, will be talking in detail and on the record about their internal content moderation processes (the conference will be livestreamed, and Techdirt’s Mike Masnick will be co-running a session on some of the challenges of content moderation).

We may see even more dominoes fall at that conference, with fresh new announcements about increased transparency and due process around content moderation on even more platforms. Let’s hope so, because internet users deserve to know more about exactly when and how their online expression is censored.


Filed Under: appeals process, content moderation, due process, takedowns, transparency


Reader Comments



  • Anonymous Coward, 26 Apr 2018 @ 8:39pm (flagged by the community)

    The Ma_snick hates "Neutral Public Forums". Not even a mention.

    Up SEVEN hours without comment, so I generously give fanboys a target for ad hom.

    As for topic coverage: yet again, another conference of leftist weenies talking only to other leftist weenies, citing leftist "Media Matters" as unable to find any bias against conservatives... Sheesh.


    • Anonymous Coward, 28 Apr 2018 @ 2:50am

      Re:

      Weren't SESTA and FOSTA supposed to magically make all these problems go away?

      You mean they haven't?

      Then why didn't your lawmakers just "legal harder"?


  • Richard (profile), 27 Apr 2018 @ 3:32am

    Well it's a start

    Well it does seem to be a start in the right direction.

    Whilst I actually agree with our resident troll that the bias against the right is not really fully debunked by that link*, I would like to point out that in earlier times and in other places where the MSM has been controlled mainly by the right it never made any real effort to be even handed either.

    Both sides complain like crazy when they perceive that the other one is making an unfair use of power .... and then behave exactly the same when the roles are reversed.



    *It's probably not a bias against the right as such - but rather the casual enforcement of an accepted wisdom on certain specific issues. These issues shouldn't really be left-right ones at all - but it just so happens that the people who have picked up on them are mostly on the right.


    • Wendy Cockcroft, 27 Apr 2018 @ 6:04am

      Re: Well it's a start

      Agreed. It's the wedge-issue divisive tactics on both sides that have led to a situation in which general decency is considered leftist these days. I'm not even joking; altruism is considered immoral and indeed anti-capitalist in some circles.

      RE: necessary censorship

      This is something I struggle with. Basically, I'm aware that some people WANT to behave badly and rope others into their horrible schemes, which, more often than not, result in exacerbating existing tensions and can and do result in violence. Okay, but what do you do about it? It's effectively a game of whack-a-mole with bad actors frantically spreading rumours and lies about target groups and the platform trying to differentiate between actual news stories and fake news.

      If people would just behave themselves it wouldn't be an issue but since they don't want to what do we do? Nanny them? Exert control? That just results in collateral damage and drives the problem underground. I'm not sure what the answer is but I sure as hell have a lot of questions. It's good that the platforms are making an effort to tackle the problem but honestly I fear that all they'll ever achieve is the effect of putting an ever-growing band aid on an open, festering wound.


      • Richard (profile), 27 Apr 2018 @ 9:05am

        Re: Re: Well it's a start

        Yes - your quote about Ayn Rand is quite telling - because in the end, when push came to shove, she failed to live up to her own ideals.

        Personally I have come to increasingly dislike and distrust the right/left spectrum and labels. There is an economic dimension and then there are various social and cultural dimensions. Personally I'm firmly on the left economically - and I deplore the way the right assumes that it has won the argument in that sphere.

        On the other hand I'm dismayed by the way that many on the left have been suckered by some interest groups in the social/cultural arena and often promote things that are quite the opposite of its core values.

        In some places the left/right thing has been very confusing.

        Take France for example. Who was on the left there?

        Le Pen, who wanted to keep the French holidays and pensions but was demonised as a Fascist, or Macron, whose tax giveaways to the rich probably exceed those of Donald Trump?


  • Anonymous Coward, 27 Apr 2018 @ 3:36am

    "how can we best hold them accountable and make sure that these private censorship regimes are fair, proportionate, accurate and unbiased?"

    This is going to be completely impossible and YOU ALREADY KNOW THAT, because it will ALWAYS be possible to remove content based upon undocumented, under-the-table criteria that are never made known to the public, and of course, nothing is preventing any of these companies from continuing to lie through their teeth about the matter in perpetuity. For example, there will never be a way to show that google/youtube didn't remove a given piece of content simply because they no longer wish to pay anybody their monetary cut, even though it has become painfully obvious that is exactly what those creeps are doing through undocumented policy. Getting on a person's case for pointing out these simple truths won't change that either.





