Our Bipolar Free-Speech Disorder And How To Fix It (Part 1)
from the free-speech-and-social-media dept
When we argue about how to respond to complaints about social media and internet companies, the resulting debate seems to break down into two sides. On one side, typically, are those who argue that it ought to be straightforward for companies to monitor (or censor) more problematic content. On the other are those who insist that the internet and its forums and platforms—including the large dominant ones like Facebook and Twitter—have become central channels for exercising freedom of expression in the 21st century, and that we shouldn't put that freedom at risk by forcing the companies to be monitors or censors, not least because they're guaranteed to make as many lousy decisions as good ones.
By reflex and inclination, I have usually fallen into the latter group. But after a couple of years of watching various slow-motion train wrecks centering on social media, I think it's time to break out of the bipolar disorder that afflicts our free-speech talk. Thanks primarily to a series of law-review articles by Yale law professor Jack Balkin, I now believe free-speech debates can no longer be reduced to government versus people, companies versus people, or government versus companies. No "bipolar" view of free speech on the internet is going to give us the complete answers, and it's more likely than not to give us wrong answers, because today speech on the internet isn't really bipolar at all—it's an "ecosystem."
Sometimes this is hard for civil libertarians, particularly Americans, to grasp. The First Amendment (like analogous free-speech guarantees in other democracies) tends to reduce every free-speech or free-press issue to people versus government. The people spoke, and the government sought to regulate that speech. By its terms, the First Amendment is aimed solely at restraining government impulses to censor, whether directed at (a) publishers' right to publish controversial content or (b) individual speakers' right to speak it. This is why First Amendment cases most commonly are named either with the government itself as a listed party (e.g., Chaplinsky v. New Hampshire) or with a government official, acting in his or her official role, as a named party (e.g., Attorney General Janet Reno in Reno v. ACLU).
But in some sense we've always known that this model is oversimplified. Even cases in which the complainant was nominally a private party still involved government action in the form of the enactment of speech-restrictive laws that gave rise to the complaint. In New York Times Co. v. Sullivan, the plaintiff, Sullivan, was a public official, but his defamation case against the New York Times was grounded in his reputational interest as an ordinary citizen. In Miami Herald Publishing Co. v. Tornillo, plaintiff Tornillo was a citizen running for state office who invoked a state-mandated "right of reply" to compel the Herald to print his responses to editorials critical of his candidacy. In each of these cases, the plaintiff's demand did not itself represent a direct exercise of government power; the private plaintiffs' complaints were personal to them. Nevertheless, the Supreme Court deemed the role of government in each case (in protecting reputation as a valid legal interest, and in providing a political candidate a right of reply) to be an exercise of governmental power. For this reason, the Court concluded that these cases, despite their superficial focus on a private plaintiff's cause of action, nonetheless fell within the scope of the First Amendment. Both newspaper defendants won their Supreme Court appeals.
By contrast, private speech-related disputes between private entities, such as companies or individuals, normally are not judged as directly raising First Amendment issues. In the internet era, if a platform like Facebook or Twitter chooses to censor content or deny service to a subscriber because of (an asserted) violation of its Terms of Service, or if a platform like Google chooses to delist a website that offers pharmaceutical drugs in violation of U.S. law or the law of other nations, any subsequent dispute is typically understood, at least initially, as a disagreement that does not raise First Amendment questions.
But the intersection between governmental action and private platforms and publishers has become both broader and blurrier in the course of the last decade. Partly this is because some platforms have become primary channels of communication for many individuals and businesses, and some of these platforms have become dominant in their markets. It is also due in part to concern about various ways the platforms have been employed with the goal of abusing individuals or groups, perpetrating fraud or other crimes, generating political unrest, or causing or increasing the probability of other socially harmful phenomena (including disinformation such as "fake news").
To some extent, the increasing role of internet platforms, including but not limited to social media such as Facebook and Twitter in Western developed countries, as one of the primary media for free expression was predictable. (For example, in Cyber Rights: Defending Free Speech in the Digital Age (Times Books, 1998), I wrote this: "Increasingly, citizens of the world will be getting their news from computer-based communications – electronic bulletin boards, conferencing services, and networks – which differ institutionally from traditional print media and broadcast journalism." See also "Net Backlash = Fear of Freedom," Wired, August 1995: "For many journalists, 'freedom of the press' is a privilege that can't be entrusted to just anybody. And yet the Net does just that. At least potentially, pretty much anybody can say anything online – and it is almost impossible to shut them up.")
What was perhaps less predictable, prior to the rise of market-dominant social-media platforms, was the extent to which government demands regarding content could result in "private governance" (where market-dominant companies become the agents of government demands but implement those demands less transparently than enacted legislation or recorded court cases do). What this has meant is that individual citizens concerned about exercising their freedom of expression in the internet era may find that exercising the option to "exit" (in Albert O. Hirschman's sense) imposes great costs.
At the same time, a lack of transparency about platform policy (and private governance) may make it difficult for individual speakers to determine what laws or policies lie behind the censorship of their content (or the exclusion of themselves or others), and thus to give effective "voice" to their complaints. For example, they may infer that their censorship or "deplatforming" reflects a political preference that has the effect of "silencing" their dissident views, which in a traditional public forum might clearly be understood as protected by First Amendment-grounded free-speech principles.
These perplexities, and the current public debates about freedom of speech on the internet, create the need to reconsider internet free speech not as a simplistic dyad, or as a set of simplistic, self-contained dyads, but instead as an ecosystem in which decisions in one part may well lead to unexpected, undesired effects in other parts. A better approach would be to consider internet freedom of expression "ecologically," treating expression on the internet as an "ecosystem," and to approach various legal, regulatory, policy, and economic choices as "free-speech environmentalists," with the underlying goal of preserving the internet free-speech ecosystem in ways that protect individuals' fundamental rights.
Of course, individuals have fundamental rights beyond freedom of expression. Notably, there is an international consensus that individuals deserve, inter alia, some kind of right to privacy, although, as with expression, there is some disagreement about what the scope of privacy rights should be. But changing the consensus paradigm of freedom of expression so that it is understood as an ecosystem not only will improve law, regulation, and policy regarding free speech, but also will provide a model that may prove fruitful in other areas, like privacy.
In short, we need a theory of free speech that takes into account complexity. We need to build consensus around that theory so that stakeholders with a wide range of political beliefs nevertheless share a commitment to the complexity-accommodating paradigm. In order to do this, we need to begin with a taxonomy of stakeholders. Once we have the taxonomy, we need to identify how the players interact with one another. And ultimately we need some initiatives that suggest how we may address free-speech issues in ways that are not shortsighted, reactive, and reductive, but forward-looking, prospective, and inclusive.
The internet ecosystem: a taxonomy.
Fortunately, Jack Balkin's recent series of law-review articles has given us a head start on building that theory, outlining the complex relationships that now exist among citizens, government actors, and companies that function as intermediaries. These paradigm-challenging articles culminate in a synthesis reflected in his 2018 law-review article "Free Speech is a Triangle."
Balkin rejects simple dyadic models of free speech. Because an infographic is sometimes worth 1000 words, it may be most convenient to reproduce Balkin's diagram of what he refers to as a "pluralistic" (rather than "dyadic") model of free speech. Here it is:
Balkin recognizes that the triangle may be taken as oversimplifying the character of particular entities within any set of parties at a "corner." For example, social-media platforms are not the same things as payment systems, which aren't the same things as search engines or standard-setting organizations. Nevertheless, entities in any given corner may have roughly the same interests and play roughly the same roles. End-users are not the same things as "Legacy Media" (e.g., the Wall Street Journal or the Guardian), yet both may be subject to "private governance" from internet platforms or subject to "old-school speech regulation" (laws and regulation) imposed by nation-states or treaties. ("New-school speech regulation" may arise when governments compel or pressure companies to exercise speech-suppressing "private governance.")
Certainly some entities within this triangularized model may be "flattened" in the diagram in ways that don't reveal the depth of their relationships to other parties. For example, a social-media company like Facebook may collect vastly more data (and use it in far more unregulated ways) than a payment system (and certainly far more than a standard-setting organization). Balkin addresses the problem of Big Data collection by social-media companies and others (including the issue of how Big Data may be used in ways that inhibit or distort free speech) by suggesting that such data-collecting companies be considered "information fiduciaries" with obligations that may parallel or be similar to those of more traditional fiduciaries such as doctors and lawyers. (He has developed this idea further in separate articles, both sole-authored and co-authored with Jonathan Zittrain.)
Properly speaking, the information-fiduciary paradigm maps more clearly to privacy interests than to free-expression interests, but the collection, maintenance, and use of large amounts of user data can bear on free-speech contexts as well. The information-fiduciary concept may not seem directly relevant to content issues. But it is indirectly relevant if the information fiduciary (possibly but not always at the behest of government) uses user data to try to manipulate users through content, or discloses users' content choices to government (for example).
In addition, information fiduciaries functioning as social-media platforms have a different relationship with their users, who create the content that makes these platforms attractive. In the traditional world of newspapers and radio, publishers had a close, voluntary relationship with the speakers and writers who created their content, which meant that traditional-media entities generally had strong incentives to protect their creators. To a large degree, publisher and creator interests were aligned, although there were predictable frictions, as when a newspaper's or broadcaster's advertisers threatened to withdraw financial support over controversial speakers and writers.
With online platforms, that alignment is much weaker, if it exists at all: Platforms lack incentives to fight for their users' content, and indeed may have incentives to censor it themselves for private profit (e.g., advertising dollars). In the same way that the traditional legal or financial or medical fiduciary relationship is necessary to correct possible misalignment of incentives, the "information fiduciary" relationship ought to be imposed on platforms to correct their misaligned incentives toward private censorship. In a strong sense, this concept of information fiduciary is a key to understanding why a new speech framework is arguably necessary, and how it might work.
I've written elsewhere about how Balkin's concept of social-media companies (and others) as information fiduciaries might actually position the companies to be stronger and better advocates of free expression and privacy than they are now. But that's only one piece of the puzzle when it comes to thinking ecologically about today's internet free-speech issues. The other pieces require us to think about the other ways in which "bipolar thinking" about internet free speech not only causes us to misunderstand our problems but also tricks us into coming up with bad solutions. And that's the subject I'll take up in Part 2.
Mike Godwin (@sfmnemonic) is a distinguished senior fellow at the R Street Institute.
Filed Under: 1st amendment, free speech, social media
Reader Comments
1st paragraph
yes and yes.
there are forums for everything..If you wish to find them..There are even Honey traps for those that Discuss SOME things that are illegal.
Forums Should/must Designate WHAT is to be discussed in the sections. Beyond that, monitoring and keeping things Civil and focused on the subject is about ALL there should be.
FB is interesting and I would only ask 1 caveat.. That Politics and corps NOT Advert or send out requests/Broadcasts..JUST SIT THERE...and those who find them and want to listen, Can.
Other than that FB is fine. It's more Democratic than Any nation I've seen/heard about..
information fiduciary ??
apparently somebody (government?) must "impose" (force) "information fiduciary" on "information platforms" to make them do right
Clear as mud. But sounds like plain old government controls on speech. The author/Balkin take great effort to avoid saying that directly... plain English prose would reveal the prime agenda.
As always, the key question to all social-engineers: Who is the government gonna punish and who is it gonna reward to implement your plan?
Re: information fiduciary ??
Basically, speech as an oligarchy.
Purifying the internet of speech?
The Forums can monitor things pretty well, and if someone gets A BIT weird...make a report to WHOEVER is needed..or whomever they can find, or the local cop..
REGULATING it means you are going to HIDE IT..each time it is placed on the net, you erase it...and DO NOTHING..
Is that worth it when someone wants to??
Suicide?
Assassinate?
MDK
Blow up the CIA??
There is nothing to say.
We, for some reason, LOVE to force things to disappear. NOT FIX THINGS BEFORE they happen.. Even when you have a way and means to discover it before it happens..
It just raises the hair on the back of our necks, triggering a bit of a fight-or-flight reaction, when somebody begins talking about 'fixing' our speech.
Re: fun & games
... oh, it's like a mystery novel then: you feel you must slowly build a story to a dramatic conclusion?
Your audience here is too dumb to absorb your proposal in one concise dose?
That's a phony and deceptive approach to serious issues. IMO you are attempting to soft peddle a rather authoritarian approach to speech restrictions, knowing that it will garner much opposition if openly stated upfront.
You should be able to clearly state your proposal in one short paragraph, if you are sincere.
Stop playing games, please.
To individuals or small businesses, the industry behemoths might as well be the government because (1) they are orders of magnitude larger, (2) we can rarely say no, and (3) we have little to no direct influence over them.
Re: Re: fun & games
Well, one guy is.
Re: Re: Oh, look: Mike "I'm not a Google shill" Godwin is also
denying "advancing an argument for government control of speech", when that's obviously how taken even here.
Maybe you're just a lousy writer, Godwin. Even Hitler could write better.
By the way, WHO pays you to write? It's clearly hack-work to a spec. How is it you have leisure to write and give it to this tiny little Google-promoting site? HMM?
Re: Re: fun & games
No, he's an academic hack paid by the word. He doesn't say who pays him, of course. But clearly it's not Techdirt, nor does the site reward him with even many readers. Godwin has to comment on his thread to gin up any interest at all!
I took "Part 1" to be the usual passive-aggressive threat.
Re: Re: Re: Oh, look: Mike "I'm not a Google shill" Godwin is also
I must say, I never expected Mike Godwin to ever be the subject of a Godwin's Law comparison.
I guess, given a long enough posting history, that it was inevitable.
Re: Re: fun & games
Publish the entire essay at once, and you will whinge tl;dr. Your flavor of "criticism" is obvious.
Re: Re: fun & games
You write: "IMO you are attempting to soft peddle a rather authoritarian approach to speech restrictions, knowing that it will garner much opposition if openly stated upfront." You're not exactly familiar with my work.
You write: "You should be able to clearly state your proposal in one short paragraph, if you are sincere." I think you might be trying to say that I should state my thesis in one short paragraph. But not all essays or academic writings work that way. At any rate, there are proposals in Part 3, which has been published today. (Part 2 was yesterday.)