Senator Amy Klobuchar Says She Has A Bill To Hold Facebook Responsible For Vaccine Disinfo; But What Would The Cause Of Action Be?
from the that's-not-how-any-of-this-works dept
Earlier this week, appearing on The View, Senator Amy Klobuchar was asked about COVID disinformation and gave a pretty bizarre answer. The question specifically highlighted how Fox News has been the main vector of COVID misinformation, politicizing the fight against the disease, and Klobuchar responded by saying we should make Facebook responsible. It's really quite an incredible disconnect. Here's what she said:
And at the same time, the misinformation on the internet, which is something I'm personally taking on is outrageous. These are the biggest richest companies in the world that control these platforms, and they've got to take this crap off. We're in a public health crisis -- we still are -- we've seen major improvement thanks to the vaccines, the ingenuity of people, Biden administration getting this out, but this is holding us back. Two thirds of the people that are not vaccinated believe something that they read on the internet. That's all the facts I need. That's from a Kaiser Foundation Report.
So I'm going to introduce a bill to limit the misinformation on vaccines by saying you guys are liable if you don't take it off your platforms.
The next speaker then joked that Facebook is "more likely to remove a breastfeeding shot than some misinformation," which... um... is not even remotely true. It may have been true a decade ago, but that talking point is long obsolete.
But the larger point here is: make Facebook liable for what, exactly? Whether we like it or not, vaccine misinformation is still protected speech under the 1st Amendment, and no bill Klobuchar introduces can change the 1st Amendment. So even if you make Facebook "liable," there is still no cause of action, because the misinformation itself does not (and cannot) violate any law in the US.
And, as has been pointed out over and over again, it's not as easy as everyone assumes for these sites to just snap their fingers and make such misinformation disappear. Everyone thinks it's easy because they've never had to do it themselves, especially not at the scale of a Facebook. First, you need clear definitions of what qualifies as vaccine misinformation, ones that can be explained to tens of thousands of human moderators. Then you need to train those moderators to recognize what is actually misinformation, as opposed to someone just commenting about vaccines (including people who aren't experts and might get some small things wrong). Then you need to set parameters for which kinds of misinformation should lead to which responses. Do you shut down accounts entirely? Do you give people warnings? Do you make them delete specific content? Then you have to deal with degrees of misinformation. How do you handle someone who presents something technically factual, but placed in a warped context such that it implies something false? How about someone who presents incomplete information? Or someone who presents factual information, but whose interpretation of it is incorrect? And how do you know who is doing any of this deliberately and who is just unclear?
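Just to illustrate how many judgment calls even the "easy" version involves, here's a minimal, purely hypothetical sketch in Python of the kind of response ladder such a bill would implicitly demand. None of these categories, names, or thresholds come from any real platform; they are assumptions made up for illustration. And notice that the genuinely hard part described above (classifying a given post into one of these buckets, billions of times a day) is simply assumed away as an input.

```python
# A deliberately naive, hypothetical sketch of a moderation response ladder.
# Nothing here reflects any real Facebook system; every name and threshold
# is invented, and each branch hard-codes a judgment call that is genuinely
# contested at scale.
from dataclasses import dataclass
from enum import Enum, auto

class Category(Enum):
    FALSE_CLAIM = auto()          # flatly false statement of fact
    MISLEADING_CONTEXT = auto()   # technically true, framed to imply something false
    INCOMPLETE = auto()           # factual, but missing key information
    BAD_INTERPRETATION = auto()   # accurate facts, incorrect conclusion
    HONEST_ERROR = auto()         # a non-expert getting small things wrong

class Action(Enum):
    REMOVE_POST = auto()
    WARN_USER = auto()
    SUSPEND_ACCOUNT = auto()
    NO_ACTION = auto()

@dataclass
class Post:
    category: Category
    prior_strikes: int  # how many previous violations this account has

def decide(post: Post) -> Action:
    """Map an already-classified post to a response. The classification
    itself (the hard part the article describes) is assumed to exist."""
    if post.category is Category.FALSE_CLAIM:
        # Repeat offenders lose the account; first-timers lose the post.
        return Action.SUSPEND_ACCOUNT if post.prior_strikes >= 2 else Action.REMOVE_POST
    if post.category is Category.MISLEADING_CONTEXT:
        return Action.WARN_USER
    # Incomplete info, wrong interpretations, and honest mistakes all
    # arguably "spread misinformation." Does a liability bill cover them?
    return Action.NO_ACTION

if __name__ == "__main__":
    print(decide(Post(Category.FALSE_CLAIM, prior_strikes=0)))         # Action.REMOVE_POST
    print(decide(Post(Category.MISLEADING_CONTEXT, prior_strikes=5)))  # Action.WARN_USER
```

Even this toy version forces answers to every question above, and any line drawn (two strikes? warnings for misleading framing? a pass for honest errors?) will look wrong to someone with a lawyer.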
Then you have to deal with the false positives (of which there will be many, including people trying to spread accurate counter-information in response to those spreading disinformation). Then you have to recognize that disinformation strategies will continue to evolve over time, and that those with a vested interest in spreading such content will change their tactics, so whatever worked yesterday won't work tomorrow. And then you have to recognize that you're still going to miss a massive amount of the content, because Facebook has 2.8 billion users around the globe, and no company, no matter how big, and with however many AI bots and human content moderators, can ever possibly review all of it. In short, you're putting companies in an impossible position.
Demanding the impossible is not good policy.
Can Facebook do a better job of all of this? Of course. Has the company been flippant and silly in the past in responding to controversies over content moderation? Absolutely. But demanding the impossible, and threatening unconstitutional regulatory responses for failing to achieve it, seems... counterproductive?
Filed Under: 1st amendment, amy klobuchar, cause of action, content moderation, disinformation, section 230, vaccine disinformation
Companies: facebook