Removing Civil Rights Law From Section 230 Will Create Many New Problems, While Failing To Fix Existing Ones
from the wrong-approach dept
We've covered so many bad-faith bills attempting to undermine Section 230 for silly and disingenuous reasons. However, I expect we'll be seeing many more bills that actually mean well and have good intentions behind them... but are still problematic and may make things worse. A new example of this is a not-yet-introduced bill from Rep. Yvette Clarke, along with Rep. Mike Doyle. They've released a "discussion draft" of the bill, which they've dubbed the Civil Rights Modernization Act of 2021. This bill does two things that so many Section 230 reform bills do not: (1) it appears to attack an actual, clearly stated problem, and (2) it attempts to take a narrow approach to it.
Unfortunately, as currently written, the bill fails to deal with the actual problems, and is likely to create a wide variety of unintended consequences that do a lot more harm than good.
The idea behind the bill is simple: add yet another exemption to Section 230, so that it would no longer apply to civil rights law in one specific situation: targeted advertising. The bill comes almost directly in response to a ProPublica report from years ago showing that, because of Facebook's ad targeting tools, landlords were able to exclude users by race. This is horrific, and it drags the world back to decades of regrettable US history in which redlining was the norm and communities were designed (with the support of the government) to exclude people of color. Civil rights laws were supposed to help end that practice, and it's completely understandable to be horrified to see that Facebook may have been inadvertently bringing it back.
After that report came out, Facebook promised to update its policies and tools to deal with this, explicitly banning discriminatory practices within its ads and promising more enforcement against such ads. Of course, as we know, content moderation at scale is impossible to do well, and a follow-up report by ProPublica a year later... found the problem still existed. Facebook blamed a "technical failure" for missing those ads, but... yeah... not a good look for Facebook.
Another year and a half after that, Facebook once again announced changes to its policies for dealing with discrimination in advertising, noting that the changes came after a bunch of civil rights organizations had sued the company over the discriminatory ads, and were part of a settlement with those groups. Then, just a week and a half later, Facebook got hit with another lawsuit, this time from the US government, over these same discriminatory ads.
Meanwhile, last summer, The Markup found... the same type of discriminatory ads on Facebook. So, whatever Facebook is doing, it hasn't been able to solve this issue.
Given all of that, it might seem totally reasonable to argue that this bill makes sense. But if you start to peel back the layers, that does not appear to be the case, and the bill might do a lot more harm than good. First off, let's go back to the core reason Section 230 exists in the first place: to put liability in the right place. There is nothing, right now, that stops anyone from holding landlords who advertise in a discriminatory fashion liable. Indeed, if they're the ones doing the targeting in this manner, it seems only appropriate to hold them accountable for violating fair housing laws. And if you go after many of them for abusing targeting tools in this way, that will hopefully eliminate much of the problem, simply by convincing ad buyers themselves to avoid such discriminatory and disgusting practices.
But it gets worse from there. As Public Knowledge pointed out in an article last year, holding a platform liable for some types of speech can lead to significant suppression of important and useful speech:
While an unpopular opinion among some, the fundamental ideas behind Section 230 around third party speech are still sound. Online platforms are not like publishers that can vet and stand behind every user post, and we want online platforms to have a free hand to moderate content without fear of liability for what they take down. A regime where platforms are responsible for third-party discriminatory conduct could very easily make platforms chill the speech of their users for fear of liability. We have evidence that this would likely be the case as seen in platform’s struggles to curb COVID-19 misinformation. Current content moderation AI is not as sophisticated as some of the platforms would like us to believe, especially when moderating the content of BIPOC people. Complicating this even further is that the roles of the platform and the user (employer, realtor, financial institution etc.) are not always clear. Did the user engage in the discriminatory action with the tools provided by the platform or did the platform present discriminatory tools to an unknowing user? A recent study showed that even when given neutral advertisements, Facebook showed different ads to different groups at different rates even when controlled for population, which highlights that even under the best intentions there may be a need to prioritize the platform’s liability as opposed to the third-party content of the advertiser or user.
And let's be realistic about what's likely to happen if a bill like this became law. The threat of liability in a realm that, as noted in the paragraph above, is effectively impossible to police perfectly would lead to a vast overreaction and a clamping down on incredibly useful tools -- and could do more harm than good for the very people it seeks to help and support. For example, in the past few years (in part thanks to the power of the internet) a large number of new companies have sprung up providing health & beauty supplies, aimed at serving people of color who are often not well served by the market.
It's not difficult to see how, as a result of this law, Facebook and others would completely block the ability to effectively target audiences like this, even when doing so is totally appropriate and non-discriminatory. The risk of liability may simply be too high for websites, and therefore you end up back in the regrettable world where the default advertisement targets a white, middle-class consumer (as it has for decades), because anything more accurately targeted... runs the risk of liability under such a law.
Another way to think of this is that there are times when it is entirely non-nefarious to target members of a specific community with ads based on a particular characteristic. If you're selling Passover Haggadahs, you tend to want to target a Jewish audience. You might advertise in Jewish magazines or publications. That doesn't violate civil rights law, and it wouldn't if you ran those advertisements aimed at Jewish people online either. But, because of the very risk of liability, websites might ban all such targeted advertising entirely, leading again to an end result where those niche communities are underserved, because the only ads you can place are targeted at the most mainstream, lowest-common-denominator audiences.
That seems like the opposite of what people who support civil rights should want.
And then there's a very serious question of whether Section 230 is even a problem here in the first place. As noted above, Facebook has already been sued multiple times by both civil rights organizations and the government over these ads. And while it's true that Facebook has tried to use 230 in response to the HUD lawsuit, we already have a somewhat similar case on the books, which is considered one of the key Section 230 cases. In Fair Housing Council of San Fernando Valley v. Roommates.com, the 9th Circuit found that Roommates was not protected by Section 230 for discriminatory content that it created. In that case, which also involved housing discrimination, Roommates created a pull-down menu letting users select discriminatory preferences for a roommate. The court found that, since that pull-down menu was created by the company and not a third party, the company was not immune from the lawsuit. (For what it's worth, an oft-forgotten coda to this story is that even though the court rejected Roommates' Section 230 defense, years later the court found that the company had not actually violated fair housing discrimination rules.)
There's one other reason to be very wary of this civil rights carveout for Section 230: there's a very, very big chance that rather than being used to crack down on discrimination in housing, it would be abused by white nationalists to demand access to services that don't want them. If you look at the actual case law of where civil rights claims have been brought in an attempt to get around Section 230, you find a series of highly questionable cases -- including a Russian internet troll farm, a proudly misogynistic video blogger, a Twitter user who lost his account for tweeting hateful content directed at Daily Show host Trevor Noah, a guy who claims he lost his Twitter account for expressing his "heterosexuality and Christian affiliation," and a well known white supremacist -- all of whom claimed their civil rights were violated by being removed from social media.
Most of those claims were rejected on Section 230 grounds, but opening up the possibility of removing civil rights law from Section 230's protections may lead to a flood of similar lawsuits from truly awful people who are mad that they were removed from social media for hateful views, and claiming that such removals violate their civil rights.
Again, the bill almost certainly comes from a place of good intentions. And there are quite reasonable concerns about how Facebook's targeting in particular has been used to discriminate in housing, and possibly in other areas as well (such as jobs). But that's not a problem that requires changing Section 230 to fix. Instead, you'd think a smarter approach would be to go after those doing the actual discrimination.
In summary, this bill is:
- Not clearly needed / mistargeted
- Likely to harm marginalized communities by limiting some of their own perfectly reasonable advertising abilities
- Already dealt with in the Roommates case
- Likely to be abused by terrible, terrible people to claim their hateful views are being discriminated against
Filed Under: civil rights, discrimination, intermediary liability, mike doyle, section 230, yvette clarke
Companies: facebook
Reader Comments
There are probably thousands of those, spread across the various states, while there is only one Facebook. It is obviously more efficient to go after the one.
Sadly, the above is not sarcasm.
Once you start making special-case "carve outs," there will be no end to it. The simple, straightforward description of rights and responsibilities as they apply to websites and user-generated content that is Section 230 will turn into a nightmarish maze of dangerous minefields that no one will be able to navigate, or even be able to afford to risk trying to navigate, except the Google / Facebook / Twitter types, and even they will be hobbled, as Mike has pointed out.
Also, Passover Hagadadahs?
Or, to put it more bluntly: Removing civil rights law from Section 230 will make it another “no fly” list.
Re:
Plus, Facebook has oodles of money.
Section 230 requires so many carveouts that we should just nuke it. Fair-housing and employment discrimination in classified advertising shouldn't convey any immunity but they have in well-known cases. There's nothing wrong with eliminating this. Whistleblowers also get retaliated against by defamation and would likely gain protections while 230 could remain on the books largely untouched otherwise.
The US is the only country with Section 230, and other countries like Britain, Canada, India, Australia, and Germany, to name several, do fine without it, instead using a DMCA-style notice-and-takedown system which obviously works well enough.
There most definitely are minor errors which cause stuff to be taken down (like a fair use), but when you're going after a mass-piracy site that sells 1,000 stolen books and destroys revenue for hard-working authors (while profiting from crime, which is also bad), those errors can be fixed.
Re:
In most other countries, the person or company breaking the law is the one held responsible, while in the US they are trying to make the easiest target with money responsible for crimes committed by somebody else.
Re:
Name one.
Yeah, and it's the only country with the US Constitution also. What's your point? Or perhaps you really believe that the countries you mentioned don't have their own laws that fill the same purpose?
Oh, it's you. That explains the dishonest arguments and the burning strawmen waltzing around.
Just an FYI, Jhon: If 230 gets nuked, you’ll likely lose every online outlet for your ridiculous bullshit. And I doubt many people in meatspace will want to hear it. So keeping 230 alive is actually in your best interests.
Good on Facebook for trying to prevent this, but just because it's possible to do on Facebook doesn't mean it's Facebook's fault when someone does it. It's possible to do illegal things with all sorts of apps and services. You don't blame Ford for people breaking the law just because it makes cars capable of driving too fast through school zones or through red lights without stopping.
We saw what happened with FOSTA/SESTA, and that was an absolute disaster.
Re:
You know, Jhon, talking about your Rose McGowan fantasies isn't going to help that Masnick press release you keep threatening everyone with.
Re:
instead using a DMCA-style notice-and-takedown system which obviously works well enough.
lol
I voted "funny"
Re:
"The US is the only country with Section 230,"
Yes, because - as has been explained to you many, many times - the US is the only country where it's even possible to sue a platform for things that someone else did on their property. The US system is so bad that what's naturally implicit elsewhere had to be spelled out explicitly there.
"mass-piracy site"
Oh, and you're still on your ignorant "shoehorn piracy into things that have nothing to do with piracy" crusade? Aren't you sick of those lies yet?
The premise of this argument is that the value of targeted ads, the way FB does them, outweighs any and all restrictions that would limit the general utility of this targeting process. This is what has led not only to FB profiting from racial discrimination and inciting hate crimes (by using its amazingly effective AI to direct ads where they will be the most effective), but also to FB creating the echo chamber where people only hear what they want. TechDirt consistently flacks for FB's right not to be just a platform but to offer products that enhance social harm without any liability -- as if the core value of free expression were to allow a corporation to make as much money as possible out of advertising technology, regardless of whatever harm the advertising causes, and regardless of how crucial a role FB's ad targeting played in actually inciting harm and violence.
You say FB can't tell the difference between calls for race war and selling Passover Haggadahs. Oh, please. FB has the most sophisticated, nuanced ad targeting system in the world, capable of directing ads for just about any product or cause to the people who are responsive. FB's gargantuan profits come from its ability to do precise and effective targeting. But, says TechDirt, just not when it comes to differentiating between ads that incite racist and other harmful conduct and ads for non-harmful products.
One last point -- incitement of violence and other harm is an externality of FB ad targeting that, in practice, is borne by the victims of this harm. Making FB pay for the harm its ads incite would simply make FB advertising more expensive. The argument TechDirt is making boils down to this: the social good provided by cheap advertising (and gigantic profits for FB) justifies imposing this cost on the victims of harm caused by FB's profit-motivated assistance in promulgating the message. If it can be shown legally that FB's activity was a proximate cause of the harm, is this really a defensible position?
Re:
"The US is the only country with Section 230, and other countries like Britain, Canada, India, Australia, and Germany, to name several, do fine without it"
I thought those were nasty socialist places we do not like - are we now supposed to be more like those damned commies? I think not!
Re: Re:
The Googles, Facebooks and Amazons based in Britain, Canada, India and Australia are doing just awesome without Section 230... The issue hasn't even come up!
Re:
It's not smarter because it's easier; it's smarter because it's targeted at the actual wrongdoing.
Yes, there is only one Facebook, but Facebook isn't doing anything wrong. Removing Facebook from the equation just means the people actually doing what you are trying to stop will go undeterred and can continue doing it somewhere else.
You actually don't know if the "proudly misogynistic video blogger" is misogynistic at all; Goldman states explicitly that he didn't review the YouTube channel, which was titled "Misandry Today."
Having stated that, Goldman, a 230 defender, bizarrely goes on to worry about what demons he projects onto the channel.
It is interesting to discover that Roommates didn't need 230 at all.
Perhaps many such 230 suits would resolve just fine, and in the typical manner, if there were no 230.