The Tech Policy Greenhouse is an online symposium where experts tackle the most difficult policy challenges facing innovation and technology today. These are problems that don't have easy solutions, where every decision involves tradeoffs and unintended consequences, so we've gathered a wide variety of voices to help dissect existing policy proposals and better inform new ones.

Infrastructure And Content Moderation: Challenges And Opportunities

from the complexity-is-messy dept

The signs were clear right from the start: at some point, content moderation would inevitably move beyond user-generated platforms and down to the infrastructure—the place where services operate the heavy machinery of the Internet and without which user-facing services cannot function. Ever since the often-forgotten 2010 incident in which Amazon stopped hosting Wikileaks under US political pressure, there has been a steady uneasiness about the role infrastructure providers could end up playing in the future of content moderation.

A glimpse of what this would look like came in 2017, when companies like Cloudflare and GoDaddy took affirmative action against content they considered problematic for their business models, in this case white supremacist websites that had been the subject of massive public opprobrium. Since then, that future has become the present reality, as the list of infrastructure companies performing content moderation functions keeps growing.

Content moderation has two inherent qualities that provide important context. 

First, content moderation is generally complex in real-world process design and implementation. There is a host of conflicting rights, diverse procedural norms and competing interests that come into play every time content is posted on the Internet; each case is unique and, on some level, should be treated as such.

Second, content moderation is messy because the world is messy: the global nature of the Internet, economies of scale, societal realities and cultural differences create a multi-layered set of considerations that are difficult to reconcile. 

The bright spot in all this messiness and complexity is the hope of due process and the rule of law. The theory is that, in healthy and competitive markets, users have choice and therefore it becomes more difficult for any mistakes to scale. So, if a user’s post gets deleted on one platform, the user should have the option of posting it someplace else.

Of course, such markets are difficult to achieve, and the current Internet market is certainly not in this category. But the point here is that it is one thing to have one of your posts removed from Facebook and quite another to go completely offline because Cloudflare stops providing you its services. The stakes are completely different.

For a long time, infrastructure providers were smart enough to stay out of the content business. The argument was that the actors who are responsible for the pipes of the Internet should not concern themselves with the kind of water that runs through them. Their agnosticism was encouraged because their main focus was to provide other services, including security, network reliability and performance.

However, as the Internet evolved, so did the infrastructure providers’ relationship with content. 

In the early days of content moderation, what constituted infrastructure was more discernible and structured. People would usually refer to the Open Systems Interconnection (OSI) model as a useful analogy, especially with policy makers who were trying to identify the roles and responsibilities various companies held in the Internet ecosystem.

The Internet of today, however, is very different. The layers of the Internet are no longer clearly distinguishable and, in many cases, participating actors are not operating at just the infrastructure layer or just the application layer. At the same time, as applications on the Internet gained in popularity and use, innovation started moving upstream.

“Infrastructure” is now being nested on top of other “infrastructure” all within just layer 7 of the OSI stack. Things are not as clear-cut.
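
To make that nesting concrete, here is a minimal, purely illustrative sketch in Python. The service names and the exact chain are hypothetical, not a description of any real deployment; the point is simply that a single website can depend on a stack of companies that all live at OSI layer 7, each acting as "infrastructure" for the one above it.

    # Illustrative sketch: how one website can sit on top of several
    # application-layer ("layer 7") services, each one effectively
    # infrastructure for the layer above. All names are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Layer7Service:
        name: str   # who operates this hop
        role: str   # what it provides to the layer above it

    dependency_stack = [
        Layer7Service("ExampleDNS", "resolves the site's domain name"),
        Layer7Service("ExampleCDN", "absorbs attacks and serves cached content"),
        Layer7Service("ExampleCloud", "hosts the site's application servers"),
        Layer7Service("ExamplePlatform", "hosts the user-generated posts"),
    ]

    # If any hop refuses service, everything stacked above it drops off
    # the Internet entirely, not just a single post.
    for hop in dependency_stack:
        print(f"{hop.name}: {hop.role}")

A refusal at any of these hops has a very different blast radius from a platform deleting one post, which is why the layer at which moderation happens matters so much.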

This means that, in some ways, we should not be surprised that content moderation conversations are gradually moving downstream. A cloud provider that supports a host of different websites, platforms, news outlets or businesses will inevitably have to deal with issues of content.

A content delivery network (CDN) will unquestionably face, at some point, the moral dilemma of providing its services to businesses that walk a tightrope with harmful or even illegal content. It really comes down to a simple equation: if user-generated platforms don't do their job, infrastructure providers will have to do it for them. And they do, increasingly often.

If this is the reality, the question becomes: what is the best way for infrastructure providers to moderate, considering current practices of content moderation, the significant chilling effects, and the often-missed trade-offs?

If we are to follow the “framework, tools, principles” triad, we should be mindful not to reinvent any existing ecosystem. Content moderation is not new and, over the years, a combination of laws and self-regulatory norms has ensured a relatively consistent, predictable and stable environment—at least most of the time.

Section 230 of the CDA in the US, the eCommerce Directive in Europe, Marco Civil in Brazil and other laws around the world have succeeded in creating a space where users and businesses could manage their affairs effectively and know that judicial authorities would treat their cases equally. 

For content moderation at the infrastructure level, a framework based on certainty and consistency is even more of a priority. Legal theory instructs that a lack of consistency can stunt the development of norms or undermine the way existing ones manifest themselves. In a similar vein, a lack of certainty means actors cannot organize their affairs in a way that complies with the law. For infrastructure providers that support the basic, day-to-day functions of the Internet, such a framework is indispensable.

I often say that the Internet is not a monolith. This is not only to demonstrate that the Internet was never meant to perform one single function, but also to show the importance of designing a legal framework that behaves the same way. When we talk about predictability and certainty, we must be conscious of putting in place requirements of clarity, stability and intelligibility, so that participating actors can make calculated and informed decisions about the legal consequences of their actions. That’s what made Section 230 a success for more than two decades.

Frameworks without appropriate tools to implement and assess them, however, mean little. Tools are important because they can help maximize the benefits of processes, ultimately increasing flexibility, reducing complexity, and ensuring clarity. Content moderation has consistently suffered from a lack of tools that could clearly show the effects of moderation. Think, for instance, of all the times content is taken down and there is no way to tell what the true effect is on free speech and on users.

In this context, we need to think of tools as things that would allow us to better understand the scale and the chilling effects that content moderation at the infrastructure level causes. Here is what I wrote about this last year:

“A starting point is to perform a regulatory impact assessment for the Internet. It is a tested method of policy analysis, intended to assist policy makers in the design, implementation and monitoring of improvements to the regulatory system; it provides the methodology for producing high quality regulation, which can, in turn, allow for sustainable development, market growth and constant innovation. A regulatory impact assessment constitutes a tool that ensures regulation is proportional (appropriate to the size of the problem it seeks to address), targeted (focused and without causing any unintended consequences), predictable (it creates legal certainty), accountable (in terms of actions and outcomes) and, transparent (on how decisions are made).”
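
For a flavor of how such an assessment could be made operational, here is a minimal sketch. The five criteria come from the passage above, but the 0-2 scoring scale, the pass threshold and the example figures are invented purely for illustration; a real regulatory impact assessment is a far richer, qualitative exercise.

    # Toy checklist, not an established methodology: the five properties
    # of good regulation expressed as a minimal scoring rubric.
    CRITERIA = ("proportional", "targeted", "predictable",
                "accountable", "transparent")

    def failing_criteria(scores):
        """Return the criteria a proposed rule fails (scores run 0-2)."""
        return [c for c in CRITERIA if scores.get(c, 0) < 1]

    # Hypothetical example: a takedown rule with a clear process but no
    # appeal mechanism might score zero on accountability.
    proposal = {"proportional": 2, "targeted": 1, "predictable": 2,
                "accountable": 0, "transparent": 1}
    print(failing_criteria(proposal))  # ['accountable'] -> needs rework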

In this sense, tools and frameworks are co-dependent. 

Finally, the legitimacy of any framework and of any tools depends on the existence of principles. In content moderation, it is not the lack of principles that is the problem; on the contrary, it is the abundance of them. Big companies have their own Terms of Service (ToS), states operate within their own legal frameworks, and then there is the Internet, which is designed under its own set of principles. Too many principles inevitably create too many conflicts and, therefore, consensus becomes important. 

The Santa Clara Principles on transparency and accountability in content moderation embody that consensus. Negotiated through a collaborative and inclusive process, they offer a roadmap for content moderation and remove certain obstacles in the process. Their strength lies in their simplicity and straightforwardness. In a similar vein, the properties of the Internet constitute a solid guide for understanding the potential unintended consequences that content moderation by infrastructure providers can have.

The design of the Internet is very specific and adheres to some normative principles that have existed ever since its inception. In fact, the Internet’s blueprint has not changed much since it was sketched a few decades ago. Ensuring that these principles become part of the consideration process is key. 

There are still plenty of unknowns in content moderation at the infrastructure layer. But there are also quite a few things we do know. The first is that scale is a significant factor: moderating content down the stack is not just about speech; in many cases, it can be about being present on the Internet at all.

The second thing we know is that the general principle of infrastructure actors being allowed to provide their services agnostic of the content they carry should continue to hold as the default. There might be cases where they need to engage, but that should be the exception rather than the rule, and it should abide by the “frameworks, tools and principles” identified above.

Finally, the third thing we know is that content moderation is evolving in ways that could now directly affect the future of the Internet. Ensuring that the roles and responsibilities of infrastructure providers are appropriately scoped will be content moderation’s greatest challenge yet!

Konstantinos Komaitis is a veteran of developing and analysing Internet policy to ensure an open and global Internet. Konstantinos spent almost ten years in active policy development and strategy as a Senior Director at the Internet Society, and is currently a policy fellow at the Brave New Software Foundation. Before that, he spent seven years researching and teaching at the University of Strathclyde in Glasgow, UK.

Techdirt and EFF are collaborating on this Techdirt Greenhouse discussion. On October 6th from 9am to noon PT, we'll have many of this series' authors discussing and debating their pieces in front of a live virtual audience (register to attend here).


Filed Under: complexity, content moderation, impact assessment, infrastructure


Reader Comments



    Anonymous Coward, 4 Oct 2021 @ 1:58pm

    “Infrastructure” is now being nested on top of other “infrastructure” all within just layer 7 of the OSI stack. Things are not as clear-cut.

    This is exactly a thing that needs to be considered here, and the quotes around the word bring this point straight home. I much appreciate the points this article raises here.

    People need to stop abusing words out of convenience or in an attempt to obfuscate.

    Particularly while I have been reading the current Greenhouse, it has become clear to me that far too many bandy about the term "infrastructure" without any qualification. (Kind of the way some use the term "Big Tech": you don't know what they are even talking about until they give an example, and the term was just noise in discourse to begin with.)

    So let's do define "infrastructure" and let's get a bit more granular, or... layerular(?) about it. Either label the (sometimes present, sometimes absent) layers, or just be specific. A service being a layer for another service doesn't make it a layer for the entire internet. Infrastructure is the stuff at the bottom, without which nothing works. Wire, fibre, RF, or whatever for physical packet transmission. BGP routers, DNS, low-level protocols. This is infrastructure.

    It was hazy enough for me, so i suspect all the "infrastructure" arguments are just going to further confuse the already confused, ignorant, willfully ignorant, and/or outright cynically malicious politicians, interested actors, and general public.


    sumgai, 4 Oct 2021 @ 9:59pm

    ... infrastructure companies performing content moderation

    Right there, Komaitis has left the reservation. I'm not speaking, as AC does just above, about the definition of infrastructure (though AC is definitely correct in his assertion), I'm speaking about the usual moronic conflation of speech (and moderation thereof) and association.

    No, those companies (Cloudflare et al.) are not moderating content (speech), they are simply choosing with whom they may wish to associate - an entirely different matter altogether. Different enough such that 1A specifically calls out each of them separately - it leaves no doubt in anyone's mind what the Founders intended.

    Look, if I invite you into my home for a discussion, and you start blathering about space lasers controlled by the Jews, I'm gonna ask you to leave in rather short order. Have I just committed "content moderation"? No, I have clearly and forthwithly made a choice regarding with whom I wish to associate. You, the frothing-at-the-mouth conspiracist, are completely free to spout your drivel elsewhere - I haven't denied you any of your rights, I've simply exercised one of mine.

    Mike, I'm sorry, but after 3 green-background articles in a row, I perceive that these so-called experts are not yet ready for prime-time. They have not actually thought through all of the ramifications of what they're allegedly pondering before they make their proposals. And now we see the biggest snafu yet, the conflation I mention in the first paragraph. That's probably worse than "waaaah, make the internet work like I say it should work, waaaah!" These people need a guiding hand, a gentle nudge to look at themselves, and come to understand why simpletons like me can easily pick out the flaws in their proposal. And it ain't the 7+ decades under my belt, either.

    Hopefully, you can provide that guidance, they need it.


      Mike Masnick, 4 Oct 2021 @ 10:10pm

      Re:

      I would disagree (strongly) with the idea that Konstantinos doesn't know what he's talking about. He's making the point that there are a lot more nuances here than simplistic analyses take into account.

      No, those companies (Cloudflare et al.) are not moderating content (speech), they are simply choosing with whom they may wish to associate

      Do you think phone companies should be able to deny service to people they dislike? ISPs? Imagine Comcast said they won't provide me with service any more because they don't like what I write. Is that okay? It's not as simple as you make it out to be. That's what Konstantinos is noting here.

      They have not actually thought through all of the ramifications of what they're allegedly pondering before they make their proposals.

      I'm going to argue that, at least in the case of Konstantinos, that line applies much more to what you have said than to what he said. He has spent more time than most working on this very issue. Day in, day out.

      And now we see the biggest snafu yet, the conflation I mention in the first paragraph

      It's not a conflation. He's accurately noting the impact here. It's on speech. That matters.


      Konstantinos Komaitis, 5 Oct 2021 @ 1:50am

      I am not sure how you separate the two. Freedom of expression is frequently a necessary component of the rights to freedom of assembly and association -- I guess that's what you are talking about when you say right to associate? -- when people join together for an expressive purpose. All three are protected in international and regional human rights instruments and are considered essential to the functioning of a pluralistic and democratic society. The speech implications are pretty clear and disproportionate to anything that happens up the stack.


