The Internet Is Not Just Facebook, Google & Twitter: Creating A 'Test Suite' For Your Great Idea To Regulate The Internet
from the test-it-out dept
A few weeks ago, Stanford's Daphne Keller -- one of the foremost experts on internet regulation -- highlighted how so much of the effort at internet reform seems to treat "the internet" as if it were entirely made up of Facebook, Google and Twitter. These may be the most visible sites to some, but they still make up only a small part of the overall internet (granted: sometimes it seems that Facebook and, to an only slightly lesser extent, Google, would like to change that, and become "the internet" for most people). Keller pointed out that the more that people -- especially journalists -- talk about the internet as if it were just those three companies, the more it becomes a self-fulfilling prophecy, in part because it drives regulation that is uniquely focused on the apparent "problems" associated with those sites (often mis- and disinformation).
I was reminded of this now, with the reintroduction of the PACT Act. As I noted in my writeup about the bill, one of the biggest problems is that it treats the internet as if every website is basically Google, Facebook, and Twitter. The demands that it puts on websites aren't a huge deal for those three companies -- as they mostly meet the criteria already. The only real change it would make for those sites is that they might have to beef up their customer support staff to offer telephone support.
But for tons of other companies -- including Techdirt -- the bill is an utter disaster. It treats us the same as it treats Facebook, and acts as if we need to put in place a massive, expensive customer service/content moderation operation that wouldn't make any sense, and would only serve to enable our resident trolls to demand a detailed explanation of why the community voted down their comments.
In that same thread, Keller suggested something that I think would be quite useful: a sort of "test suite" of websites that anyone proposing internet regulation would have to run their proposal against, exploring how the regulations would affect each of those sites.
We should have a standard test suite, against which all factual claims about "platforms" or "intermediaries" can be vetted.
- Wikipedia
- CloudFlare
- Automattic
- https://t.co/AEOaZAxcFv
- https://t.co/5CKnS7E5FN
- Others?
— Daphne Keller (@daphnehk) February 22, 2021
She suggested that the test suite could include Wikipedia, Cloudflare, Automattic, Walmart.com and the NY Times.
I'd extend that list significantly. Here would be mine:
- Wikipedia
- Github
- Cloudflare
- Zoom
- Clubhouse
- Automattic
- Amazon
- Shopify
- NY Times / WSJ
- Patreon
- Internet Archive
- Mastodon
- Nextdoor
- Steam (Valve)
- Eventbrite
- Discord
- Dropbox
- Yelp
- Twilio
- Substack
- Matrix
- Glitch
- Kickstarter
- Slack
- Stack Overflow
- Notion
- Airtable
- WikiHow
- ProductHunt
- Instructables
- All Trails
- Strava
- Bumble
- Ravelry
- DuoLingo
- Shapeways
- Coursera
- Kahoot
- Threadless
- Bandcamp
- Magic Cafe
- Wattpad
- Figma
- LibraryThing
- Fandom
- Geocaching
- VSCO
- BoardGameGeek
- DnDBeyond
- GuitarMasterClass
- Metafilter
- BoingBoing
- Cameo
- OnlyFans
- Archive of Our Own
- Itch.io
- Etsy
- Tunecore
- Techdirt
And... that's kind of the point. The great thing about Section 230 is that it allows each of these websites to take their own approach to content moderation, an approach that fits their community. Some of them rely on users to moderate. Some of them rely on a content moderation team. But if you run through this list and explore something like the PACT Act -- or the even worse SAFE TECH Act -- you quickly realize that it would create impossible demands for many, many of these sites.
Incredibly, all this would do is move most of the functions of many of these sites -- especially the small, niche, targeted communities -- over to the internet giants of Facebook and Google. Does anyone legitimately think that a site like LibraryThing needs to issue twice-a-year transparency reports on its content moderation decisions? Or that All Trails should be required to set up a live call center to respond to complaints about content moderation? Should Matrix be required to create an Acceptable Use Policy? Should the NY Times have to release a transparency report regarding what comments it moderated?
For many of these companies -- especially the more niche community sites -- the likely response is that there's simply no way they can do that. And so many of those sites will go away, or will vastly curtail their community features. And that takes us right back to the point we started with, as raised by Keller: when we treat the internet as if it's just Facebook, Google, and Twitter, and regulate it as such, we drive all communities to Facebook, Google, and Twitter as the only companies that can actually handle the compliance.
And why would anyone (other than perhaps Facebook, Google, and Twitter!) want that?
Filed Under: internet, regulating the internet, section 230, test suite
Companies: facebook, google, twitter