Bad Section 230 Bills Come From Both Sides Of The Aisle: Schakowsky/Castor Bill Would Be A Disaster For The Open Internet
from the that's-not-how-any-of-this-works dept
It truly is stunning how every single bill that attempts to reform Section 230 appears to be written without any intention of ever understanding how the internet or content moderation works in actual practice. We've highlighted tons of Republican-led bills that try to force websites to host more content, not realizing (1) how unconstitutional that is and (2) how it would turn the internet into a giant garbage fire. On the Democratic side, the focus seems to be much more on forcing companies to take down constitutionally protected speech, which similarly (1) raises serious constitutional issues and (2) will lead to massive over-censorship of perfectly legal speech just to avoid liability.
The latest bill of the latter kind comes from Reps. Jan Schakowsky and Kathy Castor. Schakowsky has been saying for a while now that she was going to introduce this kind of bill to browbeat internet companies into being a lot more proactive in taking down speech she dislikes. The bill, called the Online Consumer Protection Act, has now been introduced, and it seems clear that it was written without ever conferring with anyone who has any experience running a website. It's the kind of thing you write when you've just come across a problem, but don't think it's worth talking to anyone to understand how things really work. It's also very much a "something must be done, this is something, we should do this" kind of bill, of the sort that shows up way too often these days.
The premise of the bill is that websites "don't have accountability to consumers" for the content posted by users, and that they need to be forced to have more accountability. Of course, this leaves out the basic fact that if "consumers" are treated badly, they will go elsewhere, so every website already has some accountability to consumers: if a site is bad at this, it will lose users, advertisers, sellers, buyers, whatever. But, apparently, that's not good enough for the "we must do something" crowd.
At best the Online Consumer Protection Act will create a massive amount of silly busywork and paperwork for basically any website. At worst, it will create a liability deathtrap for many sites. In some ways it's modeled after the idiotic policy we have regarding privacy policies. Almost exactly a decade ago we explained why the entire idea of a privacy policy is dumb. Various laws require websites to post privacy policies, which no one reads, in part because it would be impossible to read them all. The only way a site gets in trouble is by not following its privacy policy. Thus, the incentives are to craft a very broad privacy policy that gives sites leeway -- meaning they have less incentive to actually create more stringent privacy protections.
The OCPA basically takes the same approach, but... for "content moderation" policies. It requires basically every website to post one:
Each social media platform or online marketplace shall establish, maintain, and make publicly available at all times and in a machine-readable format, terms of service in a manner that is clear, easily understood, and written in plain and concise language.
That terms of service will require a bunch of pointless things, including a "consumer protection policy" which has to include the following:
FOR SOCIAL MEDIA PLATFORMS.—For social media platforms, the consumer protection policy required by subsection (a) shall include—
(A) a description of the content and behavior permitted or prohibited on its service both by the platform and by users;
(B) whether content may be blocked, removed, or modified, or if service to users may be terminated and the grounds upon which such actions will be taken;
(C) whether a person can request that content be blocked, removed, or modified, or that a user’s service be terminated, and how to make such a request;
(D) a description of how a user will be notified of and can respond to a request that his or her content be blocked, removed, or modified, or service be terminated, if such actions are taken;
(E) how a person can appeal a decision to block, remove, or modify content, allow content to remain, or terminate or not terminate service to a user, if such actions are taken; and
(F) any other topic the Commission deems appropriate.
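The bill never says what "machine-readable" means in practice, and it specifies no schema. Purely as a hypothetical sketch of what a site might end up publishing to satisfy the list above (every field name here is my own invention, not anything from the bill):

```python
import json

# Hypothetical machine-readable "consumer protection policy" covering
# items (A) through (E) from the bill's list. The structure and field
# names are invented for illustration; the OCPA defines no schema.
policy = {
    "permitted_content": "Anything legal that isn't spam or harassment.",   # (A)
    "moderation_actions": ["block", "remove", "modify", "terminate"],       # (B)
    "removal_requests": {"who": "any person", "how": "via a web form"},     # (C)
    "user_notification": "email, with a window to respond before action",  # (D)
    "appeals": "submitted through the same form, reviewed on a case basis" # (E)
}

# Serialize for the required public, machine-readable disclosure.
print(json.dumps(policy, indent=2))
```

Note that even this toy version immediately runs into the problem described below: every time moderation practice changes to handle a new edge case, the published artifact has to change with it.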
It's difficult to look at that list and not laugh, and wonder whether whoever came up with it has ever been anywhere near a content moderation or trust & safety team, because that's not how any of this works. Trust & Safety is an ongoing effort that constantly has to adjust and change with the times, and there is no possible policy that can cover all cases. Can whoever wrote this bill listen to the excellent Radiolab episode about content moderation and think through how that process would have played out under this bill? If every time you change the policies to cover a new case you have to publicly update your already ridiculously complex policies -- while the new requirement is that those same policies remain "clear, easily understood, and written in plain and concise language" -- you've created an impossible demand.
Hell, someone should turn this around and push it back on Congress first. Hey, Congress, can you restate the US civil and criminal code such that it is "clear, easily understood, and written in plain and concise language?" How about we try that first before demanding that private companies do the same for their ever-changing policies as well?
Honestly, requiring all of this be in a policy is just begging angry Trumpists to sue websites claiming they didn't live up to the promises made in their policies. We see those lawsuits today, and they're kicked out of court under Section 230 -- but Schakowsky's bill exempts this part from 230. It's bizarre to see a Democratic bill that will lead to more lawsuits from pissed-off Trumpists who have been removed, but that's what this bill will do.
Also, what "problem" does this bill actually solve? From the way the bill is framed, it seems like Schakowsky wants to make it easier for people to complain about content and to get the site to review it. But every social media company already does that. How does this help, other than put the sites at risk of liability for slipping up somewhere?
The bill then has separate requirements for "online marketplaces" which again suggest literally zero knowledge or experience with that space:
FOR ONLINE MARKETPLACES.—For online marketplaces, the consumer protection policy required by subsection (a) shall include—
(A) a description of the products, product descriptions, and marketing material, allowed or disallowed on the marketplace;
(B) whether a product, product descriptions, and marketing material may be blocked, removed, or modified, or if service to a user may be terminated and the grounds upon which such actions will be taken;
(C) whether users will be notified of products that have been recalled or are dangerous, and how they will be notified;
(D) for users—(i) whether a user can report suspected fraud, deception, dangerous products, or violations of the online marketplace’s terms of service, and how to make such report;
(ii) whether a user who submitted a report will be notified of whether action was taken as a result of the report, the action that was taken and the reason why action was taken or not taken, and how the user will be notified;
(iii) how to appeal the result of a report; and
(iv) under what circumstances a user is entitled to refund, repair, or other remedy and the remedy to which the user may be entitled, how the user will be notified of such entitlement, and how the user may claim such remedy; and
(i) how sellers are notified of a report by a user or a violation of the terms of service or consumer protection policy;
(ii) how to contest a report by a user;
(iii) how a seller who is the subject of a report will be notified of what action will be or must be taken as a result of the report and the justification for such action;
(iv) how to appeal a decision of the online marketplace to take an action in response to a user report or for a violation of the terms of service or consumer protection policy; and
(v) the policy regarding refunds, repairs, replacements, or other remedies as a result of a user report or a violation of the terms of service or consumer protection policy.
Honestly, this reminds me a lot of Josh Hawley's bills, in that it seems that both Hawley and Schakowsky want to appoint themselves product manager for the internet. All of the things listed above are the kinds of things that most companies do already because you need to do it that way. But it's also the kind of thing that has evolved over time as new and different challenges arise, and locking the specifics into law does not take into account that very basic reality. It also doesn't take into account that different companies might not fit into this exact paradigm, but under this bill will be required to act like they do. I can't see how that's at all helpful.
And, it gets worse. It will create a kind of politburo for how all internet websites must be run:
Not later than 180 days after the date of the enactment of this Act, the Commission shall conduct a study to determine the most effective method of communicating common consumer protection practices in short-form consumer disclosure statements or graphic icons that disclose the consumer protection and content moderation practices of social media platforms and online marketplaces. The Commission shall submit a report to the Committee on Energy and Commerce of the House of Representatives and the Committee on Commerce, Science, and Transportation of the Senate with the results of the study. The report shall also be made publicly available on the website of the Commission.
Yeah, because nothing works so well as having a government commission jump in and determine the "best" way to do things in a rapidly evolving market.
Also, um, if the government needs a commission to tell it what those best practices are, why is it regulating how companies must act before the commission has even done its job?
There are a bunch more requirements in the bill, but all of them are nitty gritty things about how companies create policies and implement them -- something that companies are constantly changing, because the world (and the threats and attacks!) is constantly changing as well. This bill is written by people who seem to think that the internet -- and bad actors on the internet -- are static phenomena. And that's just wrong.
Also, there's a ton of paperwork for nearly every company with a website, including idiotic and pointless requirements that are busywork, with the threat of legal liability attached! Fun!
FILING REQUIREMENTS.—Each social media platform or online marketplace that either has annual revenue in excess of $250,000 in the prior year or that has more than 10,000 monthly active users on average in the prior year, shall be required to submit to the Commission, on an annual basis, a filing that includes—
(A) a detailed and granular description of each of the requirements in section 2 and this section;
(B) the name and contact information of the consumer protection officer required under subsection (b)(4); and
(C) a description of any material changes in the consumer protection program or the terms of service since the most recent prior disclosure to the Commission.

(2) OFFICER CERTIFICATION.—For each entity that submits an annual filing under paragraph (1), the entity’s principal executive officer and the consumer protection officer required under subsection (b)(4), shall be required to certify in each such annual filing that—
(A) the signing officer has reviewed the filing;
(B) based on such officer’s knowledge, the filing does not contain any untrue statement of a material fact or omit to state a material fact necessary to make the statements, in light of the circumstances under which such statements were made, not misleading;
(C) based on such officer’s knowledge, the filing fairly presents in all material respects the consumer protection practices of the social media platform or online marketplace; and
(D) the signing consumer protection officer—(i) is responsible for establishing and maintaining safeguards and controls to protect consumers and administer the consumer protection program; and
(ii) has provided all material conclusions about the effectiveness of such safeguards and controls.
So... uh, I need to hire a "consumer protection officer" for Techdirt now? And spend a few thousand dollars every year to have lawyers (and, most likely, a new bunch of "compliance consultants") review the totally pointless statement I'll need to sign each year? For what purpose?
The bill also makes sure that our courts are flooded with bogus claims from "wronged" individuals thanks to its private right of action. On top of everything else, it also exempts various state consumer protection laws from Section 230. That's buried in the bill, but it's a huge fucking deal. We've talked about this for years, as various state attorneys general have been demanding it. But that's because those state AGs have a very long history of abusing state "consumer protection" laws to effectively shake down companies. A decade ago we covered a definitive example of this, watching dozens of state attorneys general attack Topix, with no legal basis, because they didn't like how the company moderated its site. They were blocked from doing anything serious because of Section 230.
Under this bill, that will change.
And we've seen just how dangerous that can be. Remember how Mississippi Attorney General Jim Hood demanded all sorts of information from Google, claiming that the company was responsible for anything bad found online? It later came out (via the Sony Pictures hack) that the entire episode was actually funded by the MPAA, with Hood's legal demands written by the MPAA's lawyers, as part of Hollywood's explicit plan to saddle Google with extra legal costs.
Schakowsky's bill would make that kind of corruption an everyday occurrence.
And, again, the big companies can handle this. They already do almost everything listed anyway. All this really does is saddle tons of tiny companies (earning more than $250k a year?!?) with ridiculous and overly burdensome compliance costs, opening them up not just to the FTC going after them, but to any state attorney general, or any individual who feels wronged by the rules.
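To see how low that bar actually is, the filing trigger quoted earlier reduces to a trivial check. This is a hypothetical sketch of my reading of the thresholds (the function name and signature are my own, not anything in the bill):

```python
# My reading of the OCPA filing trigger: annual revenue "in excess of"
# $250,000 in the prior year, OR more than 10,000 monthly active users
# on average in the prior year. Either one alone is enough.
def must_file_with_ftc(annual_revenue_usd: int, avg_monthly_users: int) -> bool:
    return annual_revenue_usd > 250_000 or avg_monthly_users > 10_000

# A small business or a modest blog with comments clears either bar easily:
print(must_file_with_ftc(annual_revenue_usd=260_000, avg_monthly_users=500))    # True
print(must_file_with_ftc(annual_revenue_usd=50_000, avg_monthly_users=12_000))  # True
print(must_file_with_ftc(annual_revenue_usd=50_000, avg_monthly_users=500))     # False
```

Note that the "or" is doing a lot of work here: a site with essentially no revenue still gets swept in by a modest audience, and vice versa.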
The definitions in the bill are so broad that it would cover a ton of websites. Under my reading, it's possible that Techdirt itself qualifies as a "social media platform" because we have comments. This is yet another garbage bill from someone who appears to have no knowledge of or experience with how any of this works in practice, but who is quite sure that if everyone just did things the way she wanted, magically good stuff would happen. It's ridiculous.
Filed Under: busywork, consumer protection, content moderation, ftc, jan schakowsky, kathy castor, paperwork, private right of action, section 230, terms of service