Ricky Byrdsong And The Cost Of Speech
from the consequences-of-algorithmic-polarization dept
On July 2, 1999, Ricky Byrdsong was out for a jog near his home in Skokie, Illinois, with two of his young children, Sabrina and Ricky Jr. The family outing would end in tragedy. His children watched helplessly as their father was gunned down. He was the victim of a Neo-Nazi on a murderous rampage targeting Jewish, Asian and Black communities. Ten other people were left wounded. Won-Joon Yoon, a 26-year-old graduate student at Indiana University, would also be killed.
When you distill someone's life down to their final minutes, it does a disservice to their humanity and how they lived. Though I didn't know Won-Joon Yoon, I met Coach Byrdsong — one of the few Black men's head basketball coaches in the NCAA — through my father, who is also part of this small fraternity. As head coaches in Illinois in the late 90s, their names were inevitably linked to each other. They occasionally played one another. Beyond his passion for basketball, Coach Byrdsong's love of God, and his commitment to community and family shone bright.
Coach Byrdsong was the first Black head basketball coach at Northwestern University in Evanston, Illinois. His appointment was a big deal: Northwestern is a private university in an NCAA "power conference," with a Black undergraduate population of less than 6%. I visited Northwestern's arena when my dad was an assistant coach at the University of Illinois. At 11 years old, I remember being surrounded by belligerent college students making ape noises. When I hear jangling keys at sporting events, I'm transported back to the visceral feeling of being surrounded by thousands of (white) college students, alumni and locals, shaking their car keys while smugly chanting "that's alright, that's ok, you will work for me one day."
Their ditty, directed towards a basketball court overwhelmingly composed of Black, working-class student athletes, seemed to say: you don't belong here, and you never will — a sentiment that still saturates the campus. This is the world that Neo-Nazi Benjamin Smith came from. Smith was raised in Wilmette, Illinois, one of the richest and whitest suburbs in the country, less than five miles from where he killed Coach Byrdsong.
The digital boundaries that exist online, much like the neighborhood ones, carve up communities often by ethnicity, class, and subculture. In these nooks a shared story and ideology is formed that reinforces an "us against the world" mentality. It's debatable whether that's intrinsically bad — but in this filter bubble, it is hard to see our own reflection accurately, let alone others. This leaves both our digital and physical bodies vulnerable.
Matthew Hale, Smith's mentor and founder of the World Church of the Creator, was an early adopter of Internet technology. He was part of a 90s subculture of white nationalists that flocked to the web, stitching a digital hood anonymizing those who walk and work amongst us. Hale's organization linked to white power music and computer games, and developed a website, "Creativity for Kids," with downloadable white separatist coloring books. They used closed chat rooms and internet forums to rile up thirst for a race war. They understood the importance of e-commerce as a vehicle for trafficking hate, and they experimented with email bombing and infiltrating chat rooms.
Beyond being tech savvy, Hale was also a lawyer who, in 1999, was being defended by the ACLU. The Illinois Bar Association had denied Hale's law license based on his incitement of racial hatred and violence against ethnic and religious groups. The ACLU has had a long run of defending white nationalists, including Charlottesville "Unite the Right" organizer Jason Kessler. In 1978 it defended the organizers of a Nazi march in Skokie, the same community where Coach Byrdsong was assassinated. At the time, 1 in every 6 Jewish residents there was either a survivor of the Holocaust or directly related to one.
Hale's law license was rejected based on three main points:
- That a lawyer's responsibility to uphold equal justice for all is compromised when their sworn allegiance to one race comes before the greater good.
- That freedom of speech did not offer protection from the consequences of speech, nor did it mean validation and accreditation for dangerous speech, which an approved law license would imply.
- That the community standards, values and guidelines set by the law association gave the judging body the ability to define, based on a set of socially accepted criteria, what goes outside the bounds of decency and morality. From their standpoint, being an avowed Klansman, given the Klan's history of lynchings and terrorism, clearly crossed that line.
But Hale felt entitled to the right to speech with impunity. And the ACLU mounted a vigorous defense of that. What rights we are owed by institutions is a question that divides even those who are normally aligned. Though tech companies say their role is to ensure that all speech matters, free speech does not mean the right to content or user validation, amplification, or the freedom to incite violence. This is why groups that are deemed terrorist threats by companies don't have the same stranglehold that white nationalists enjoy. That they remain so openly prevalent speaks more to a tacit societal acceptance of white nationalist ideologies than it does to the matter of free speech.
Free speech is used as a red herring: a wedge to politicize conversations about universal rights and to falsely brand common-sense content moderation policies as attacks on speech, freedom, and liberties. It's an argument that's routinely manipulated to shield corporations that get rich off of "if it bleeds, it leads" incentive structures but aren't willing to accept the responsibility that comes with being about that life. Moreover, it chills the ability of marginalized groups to advocate for safer spaces.
Already, indoctrination can occur organically by nature of the interest-driven algorithmic models that permeate the platforms and trap people into digitally segregated neighborhoods. I put these algorithms to the test when I started using a device to research white nationalist communities. I found that when the computer read my data profile, it fundamentally changed my user experience. "Hey you follow Mike Cernovich," Twitter would note, "try following David Duke or David Horowitz." Amazon would see I was looking up books about Lauren Southern and make sure I knew I could also buy Mein Kampf. I was bombarded with YouTube conspiracy videos about the deadly Blacks in Chicago. With no fact check mechanism in sight, my world was shaped by who the internet thought I was. Once relegated to that bubble, it was hard to get out.
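The "try following David Duke" dynamic described above is the basic behavior of interest-based recommenders. As a rough illustration only — not any platform's actual system, and with all account names hypothetical — a minimal co-occurrence recommender suggests whatever accounts are most often followed alongside the ones you already follow, which is how a single fringe follow can snowball into a fringe feed:

```python
from collections import Counter

# Hypothetical follow graph: user -> set of accounts they follow.
follows = {
    "u1": {"acct_a", "acct_b"},
    "u2": {"acct_a", "acct_b", "acct_c"},
    "u3": {"acct_b", "acct_c"},
    "me": {"acct_a"},  # one fringe follow is enough to seed suggestions
}

def recommend(user, follows, k=2):
    """Suggest the k accounts most often co-followed with this user's follows."""
    mine = follows[user]
    scores = Counter()
    for other, theirs in follows.items():
        if other == user or not (mine & theirs):
            continue  # only consider users who share at least one follow
        for acct in theirs - mine:
            scores[acct] += 1  # each co-follower counts as one "vote"
    return [acct for acct, _ in scores.most_common(k)]

print(recommend("me", follows))  # prints ['acct_b', 'acct_c']
```

Nothing in this loop evaluates what the suggested accounts say — it only measures overlap, so a profile seeded with extremist follows gets steered toward more of the same, with no fact-check mechanism in sight.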
Institutions like the Harvard Shorenstein Center and Data & Society document how hate groups build power online. According to the Southern Poverty Law Center, there are over 1,000 hate groups active in the United States. Beyond these known hate groups there are many loosely organized factions and countless sympathizers that are one bad quarantine day away from unleashing.
In our constricted online universe ruled by clickbait gods and monsters, it takes a special type of disassociation to say that words have the power to change the world yet there should be no boundary to them. Every morning there is another father, mother, son, daughter, or person tying up their shoes to go out for a jog, a person who could be the next Ricky Byrdsong or Ahmaud Arbery. Somewhere online at this very moment another Benjamin Smith is being groomed for violence. So many of them exist that their story may never make it out of their community filter bubble. According to FBI data, there were 7,120 hate crime incidents reported in 2018. Often hate crimes — particularly against women of color and trans people — go unreported. And while the FBI reported a slight dip in the total number of hate crimes overall, the severity of the violence is getting worse.
In 2018 Change the Terms, a coalition of 50 civil and human rights organizations, released a set of model corporate principles for dealing with hate speech. Color Of Change helped develop the framework, which encourages transparency, a right to appeal, and baseline criteria for defining hate speech. The coalition is composed of organizations that either represent constituencies disproportionately targeted for hate crimes or emphasize racial, gender, identity or religious justice in their mission. While the organizations have helped lead a culture shift in Silicon Valley, there is still much work to be done.
Matthew Hale, the pied piper of amateur racists, uses his Internet platform to lure in people like Benjamin Smith. When testifying in support of Hale's petition for a law license, Smith told the court "He's given me spiritual guidance...When I first met him, I wasn't really sure what I wanted to do with my life, what direction I was going to go." Weeks later Smith killed two people before committing suicide. From prison Hale churns out hate propaganda you can buy on Amazon.
The adage "I don't agree with what you say but I defend to the death your right to say it" is often used to prop up verbal violence. But history shows that disproportionately Black and Brown bodies get sacrificed in the name of boundless white speech. For some, Matthew Hale is a cause worth theoretically dying for. Reminders of the lives he had a hand in snuffing out are often met with a digital shoulder shrug on Twitter. It's as though everyone murdered for simply existing is merely the proverbial spilt milk, splashed on the floor for the greater good of the communities left to mourn them.
Rarely have these free speech martyrs actually suffered the life and death consequences of their absolutism. I begrudge anyone their righteous cause who knows what it means to mourn yet still believe. But over Independence Day weekend in 1999, Coach Ricky Byrdsong and Won-Joon Yoon did die. The online legacy left by their assassin and his enablers would empower and inspire one killer to walk into Emanuel African Methodist Episcopal Church and murder nine people, another to massacre 11 in the Tree of Life synagogue in Pittsburgh, and yet another to gun down worshippers in the Muslim community of Christchurch, New Zealand.
Each of them set in motion the real-world consequences of an online model that monetizes polarization. And all of them left behind digital breadcrumbs for the next monster. Coach Byrdsong is one of multiple stories I could share about someone I knew who was killed or assaulted because of their ethnicity, gender identity, sexuality or religion. For me, free speech cannot truly be free when it operates on a sliding scale weighted against those with the most to lose, and it will never hold more intrinsic value than human lives.
Brandi Collins-Dexter is a visiting fellow at the Harvard Kennedy School’s Shorenstein Center on Media, Politics and Public Policy and a senior fellow at Color Of Change. She is currently writing a book about Black participation in democracy and the US economy, with particular focus on the role technology and information integrity play in improving or deteriorating community health.
Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.
Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.
While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.
–The Techdirt Team
Filed Under: benjamin smith, content moderation, filter bubbles, free speech, hatred, indoctrination, matthew hale, racism, ricky byrdsong, won-joon yoon
Reader Comments
Hale is a convicted felon
On April 6, 2005, exactly one year after his trial began, Hale was sentenced to a 40-year prison term for attempting to solicit the murder of U.S. District Court Judge Joan Lefkow. U.S. District Court Judge James Moody presided over the sentencing. During the trial, jurors heard more than a dozen tapes of Hale using racial slurs, including one in which he joked about Benjamin Smith's murderous shooting spree. According to prosecutors, Hale had asked one of his followers, Anthony Evola, to kill Lefkow.[21] In June 2016, Hale was transferred out of ADX Florence to the medium-security federal prison FCI Terre Haute in Indiana,[22] but by late 2017 he was back at Florence.[2] In July 2020, Hale was transferred out of ADX once again, this time to USP Marion, a medium-security institution in Illinois.
Hale's projected release date is April 2, 2037.[23] If he is released at that time, he will be around 66 years old.
Blaming the tool?
You made a great and persuasive argument about the negative interactions the internet can enable, but isn't the internet in this case just a microcosm of reality? If you grow up in a place with a high degree of racial bias, then you have a higher chance of being racist... you model your environment.
I guess my question is, if we start "enforcing" the "bad" speech, then aren't we inviting the government to dictate what is 'good' and what is 'bad'?
As a counter to why government shouldn't be allowed to do that, look at FOSTA, which criminalizes any site attempting to help people working in the sex trade... EVEN IN PARTS OF THE COUNTRY WHERE IT IS LEGAL.
So, while I understand your argument, agree with you in principle, I can't follow you across the bridge to start letting the government dictate 'good' and 'bad' speech... sometimes what you ask for isn't what you get.
No, we’re not. Twitter isn’t a government institution; its admins have every right to determine whether bigoted (but legally protected) speech is allowed or disallowed on the service. The government can’t, shouldn’t, and likely wouldn’t dare try to say otherwise — no matter what anyone says about “anti-conservative bias” or whatever.
Lawmakers can decry White supremacists and say the ideology is harmful to everyone, including those who believe in it. But lawmakers can’t force people to stop saying racial slurs on Facebook. Only Facebook can do that — and only Facebook should have the power to do that.
Re: Blaming the tool?
I think there's a reasonable argument to make that the Internet has allowed people with fringe views -- like white supremacists -- to propagate their beliefs and recruit more members much more efficiently than they could when David Duke was mailing out newsletters.
I also think that, while it's impossible for online forums the size of Facebook and Twitter to effectively moderate that kind of content, they could do a better job than they have up to this point.
I think that's probably the best argument against hate speech laws, yes -- remember who's in charge of enforcing the law. It's Trump; it's Barr and the DoJ; it's, not to put too fine a point on it, the police. In our current climate, if we had laws against hate speech, it wouldn't mean putting white supremacists in jail; it would mean the white supremacists in charge of enforcing our laws using them as yet another tool to persecute minorities.
Re: Blaming the tool?
"I guess my question is, if we start "enforcing" the "bad" speech, then aren't we inviting the government to dictate what is 'good' and what is 'bad'?"
Which is why, at most, government should issue guidelines in the form of foundational education — i.e., have schools include factual descriptions of the usual pitfalls of hatred, to ensure that children, on growing up, recognize a diatribe of hatred for what it is when they hear one.
It's up to the rest of us to enforce rules of ethics on social platforms and speak up rather than turn away and shut up when bigots launch their propaganda.
But education is just the first step.
The deaths and injustice you mention is indeed tragic, and is not something that should be supported.
"But history shows that disproportionately Black and Brown bodies get sacrificed in the name of boundless white speech."
There is no such thing as "white speech." Do all people who are white say the same things? Clearly not. So, that's racist. Indeed, if you are talking about "white" anything, you are a racist. And no, that is not ok because black people have historically been oppressed and marginalized.
The solution to racism is not racism, and the solution to violence is not censorship. I don't know what all the answers are, but I know those aren't among them.
Re:
It sure sounds like you're willfully misunderstanding the point she's making.
If you want to disagree with her argument, that's fine; I think a lot of people here would disagree with her views on restricting hate speech. But if you're going to disagree with her argument, disagree with her actual argument, not a misrepresentation of it.
And for God's sake don't start whining about how you're the victim of reverse racism; that just makes you look silly.
By declaring any speech that you oppose as hate speech, you create a massive loophole to the first amendment.
Re:
Was there a government actor in the article that I missed? Only a government actor declaring speech it doesn't like to be hate speech would make this a 1st Amendment violation.
On the other hand, assuming you and I are both not government actors, we can call any speech we like hate speech. But we cannot avoid consequences, such as being kicked off a platform or facing many, many, many opposing comments to our statements. Or being wrong.
Re: Re:
Was there a government actor in the article that I missed?
While not actually a government body itself, the Illinois Bar Association holds a government granted monopoly on the provision of legal services in Illinois. It is, in Illinois, illegal to practice law without a license... and also illegal for anybody except the Illinois Bar to grant such a license.
Were we to accept the assertion that such a relationship does not make the Illinois Bar a government actor for the purposes of licensing... we might as well not bother with "rights" at all, as it would be trivial for the government to legally violate them.
Re: Re: Re:
No, the Illinois Bar Association is not a government actor. You can tell by the word association in their name. In addition, it is not the Bar Association that licenses lawyers in Illinois, "The Attorney Registration and Disciplinary Commission (ARDC or Commission) operates under the authority of the Illinois Supreme Court, which has sole authority to regulate the admission and discipline of lawyers in Illinois.". The Internet is your friend, learn to use it.
Re: Re: Re:
"While not actually a government body itself, the Illinois Bar Association holds a government granted monopoly on the provision of legal services in Illinois."
And any government body must act in an impartial manner. The OP describes pretty clearly their grounds for denying Hale's law license;
"That a lawyer's responsibility to uphold equal justice for all, is compromised when their sworn allegiance to one race comes before the greater good. "
Imagine, for one second, trying to get a driver's license while demonstrating that you'll only follow traffic laws within certain areas, or a pilot's license when you'll abide by air traffic guidelines only over certain cities.
The Illinois Bar Association is tasked with handling bar certification without fear or favor, providing it to all who demonstrate a certain standard of impartiality. Even if we rule that the bar acted as a government agency, it has abided by the exact standards government is supposed to act by in its judgment.
Hale is free to tell everyone black people are inferior and need to get shot. That disqualifies him from any job which demands impartiality as its first criterion. It's that simple.
Needless to say, racists will try to bring freedom of speech into it, but that's STILL as invalid an argument as trying to bring "liberty and prosperity" into an argument about why you shouldn't have to pay your $20 parking ticket.
When you don't punish the psychos that you do catch, others will take it as an invitation to do whatever they want whenever the mood strikes them.
Which is why deplatforming works.
"Not a Government Agency"
Do you believe the law should force Facebook to host the speech of (i.e., associate with) former Ku Klux Klan leader David Duke?
Re: "Not a Government Agency"
There are multiple private platforms upon which to speak. Even if one silences you, you can speak on another. And on the small chance that they all silence you, can't you still speak elsewhere?
Doesn't sound to me like your ability to speak freely is being affected much at all. Now, if you're upset at the denial of an audience however, that's not the same thing is it?
Re: "Not a Government Agency"
"Those who are saying that this isn't a call for censorship and a violation of the 1st Amendment because no government agencies are involved either don't understand or are being deliberately disingenuous."
They're really not because unless you can actually make it illegal to express an opinion both the 1st amendment and free speech are doing fine. Lying about other people being disingenuous while trotting out a false assumption isn't helping your argument here.
"They need to be fought even more vigorously than state censors precisely because we don't have any Constitutional protection from owners of private platforms who would silence us."
No they don't. And of course you don't have constitutional protection from being thrown off another person's property. In fact I'd argue that the constitution is all on the side of the property owner when it comes to who said owner can bar from his premises or not.
But hey, I'm curious to hear your arguments as to why private property must be abolished for free speech to survive, so please, elucidate.
Just why is it that a platform owner - or a bar owner - shouldn't be allowed to set rules of conduct for his/her premises and ban those who refuse to follow said rules?
Punishment doesn't work. Many of these mass murderers kill themselves when they are close to being caught. And putting them in jail doesn't work; they just keep on speaking.
These "black and brown bodies" became free speech martyrs because they died because "someone was allowed to speak", whereas other victims of mass murderers were only killed by someone who was mentally ill, or suicidal (often the same thing), or irrationally fearful of "the other" (hmm... we're developing a pattern here).
I sympathize with the mourners, but dead is dead. Treating hate speech as a "free speech" issue is not going to put fewer dead bodies in the morgue, only maybe change the color of their skin. Treating it as a mental health issue would — IMO — go further toward solving the problem.
Maybe having platforms moderate such speech is a start. But calling it a day with just that is putting the problem in a cabinet and hoping it doesn't come out to play.
Re:
"Maybe having platforms moderate such speech is a start. But calling it a day with just that is putting the problem in a cabinet and hoping it doesn't come out to play."
This, right here. Putting hate speech under a law is one of many very bad options which won't accomplish anything other than opening the door for worse to come - usually in the blithe assumption that preventing racists from speaking openly has somehow made them all go away.
Encouraging platforms to abide by ethical standards is a great thing. It's the same argument as we use in bars. Popular bars will ban anyone caught pissing on the dance floor or screaming racial slurs through a bullhorn, providing the majority a place to go pound a few brewskis in peace and quiet.
For the people with the continence problem there are niche clubs catering to their tastes, as for the people with bullhorns.
This entire debate stems from the people with bullhorns and continence issues insisting that they are entitled an audience even if that audience is highly unwilling.
Responding to Mike on a Couple of Points
Collins-Dexter points out Change The Terms. She's not implying that "it's as easy as saying that they have to do more to combat "hate" speech", and is highlighting a coalition that, looking at their FAQ and what they've proposed, is approaching the problem with the complexity it requires.
I don't think that speech-counterspeech is useful in combating hate online. The people espousing hate rarely - if ever - come from a place of intellectual honesty. Odds are that if something running counter to their ideology pops up, they'll just block the user that posted it, and move on.
An algorithmic shift to put counterspeech in the feeds of hateful people wouldn't go unnoticed, either.
The nazis and bigots have co-opted 1984 and love calling anything they can "Orwellian" or "groupthink". I think that things such as algorithm-based counterspeech nudges, shadowbanning, and other behind-the-curtain things just give these people more rhetorical ammo to use. If someone's shadowbanned, figures out that they're shadowbanned, and realizes it's because they sling bigoted bile around, their reaction prolly isn't gonna be something like "Maybe I'm wrong and really should change my opinion on this marginalized group". The more likely reaction is that they'll be convinced they're being "persecuted," show it to their other bigoted friends & followers, and stay steady in their position or further entrench themselves.
As Stephen said farther up, deplatforming works. A study was conducted that showed that after Reddit banned some of their more nasty subreddits, toxicity went down site-wide.
I think that clear-cut consequences and deplatforming are the best ways to stop hate from spreading. Give users explicit warnings and punishments that include posts, videos, or tweets that they've made; no shadowbanning or algorithm-based counterspeech. If the users clean up their act after that, good. If the users get pissed at the warning and bugger off to some smaller, hateful hole, good. If they don't clean up their act and get banned, either temporarily or permanently, good.
Re: Responding to Mike on a Couple of Points
"I don't think that speech-counterspeech is useful in combating hate online. The people espousing hate rarely - if ever - come from a place of intellectual honesty. Odds are that if something running counter to their ideology pops up, they'll just block the user that posted it, and move on."
I think we're discussing two separate things. I agree that those fully bought into their ignorance will not care about counter speech. But a big part of this article was the idea that otherwise innocent/naive individuals get drawn deeper and deeper into hate through an algorithm pulling them there.
That's what I'm referring to. I know that a KKK/neo nazi isn't going to change his ways like that (mostly -- though there are some exceptions). But I'm responding to the idea that the naive person who is still forming an identity/opinion gets somehow "radicalized" because of the algorithm. That is where I think counterspeech is quite useful.
"As Stephen said farther up, deplatforming works. A study was conducted that showed that after Reddit banned some of their more nasty subreddits, toxicity went down site-wide."
That doesn't surprise me. The question, then, is whether overall toxicity went down... or whether the toxicity formerly on Reddit just went elsewhere where it can fester even worse. I don't know that it did, but it's an important point to explore as well.
"I think that clear-cut consequences and deplatforming are the best ways to stop hate from spreading."
If you've read this site for any length of time, you'd know that I have no problems with private platforms kicking off such people. But my concern is whether or not that actually decreases the spread of hate.
The “toxicity moving elsewhere” thing obviously happens. By how much is the real question. A couple of jerkoffs going to Twitter after getting nuked from Reddit isn't that much of an issue; several hundred jerkoffs going to Twitter, on the other hand…
OK, I'm sorry, but this is really bad.
First of all, is the argument supposed to be that Smith wouldn't have gone amok in 1999 if Hale's law license had been denied in 1978? That's an insane inference. Maybe Hale would have had more time to recruit.
Second, history does NOT "show that disproportionately Black and Brown bodies get sacrificed in the name of boundless white speech". History doesn't give out statistics like that.
And history is LONG. Very long indeed. It contains a lot of things.
History contains times and places when abolitionists were "discouraged" from speaking out against slavery, not only by the risk of imprisonment, but by every possible form of "cancellation" and social disapprobation then available. Benjamin Lay was expelled from four Quaker meetings (that's italicized because it's a big deal).
What color were the bodies who suffered most from that limitation on speech, do you think?
History contains Mohandas Gandhi spending two years in prison for sedition. You may be sure that if there'd been a Facebook of the day, it would have been under pressure to de-platform him, because he was dangerous. After all, Gandhi's nonviolence rhetoric aside, "history showed" that giving voice to discontent often led to bloody attempts at revolution. If THAT revolution had gone violent, most of the people dying would have been brown, all right. I guess they were right to try to shut him down?
History contains a time when no "upstanding" US press would publish any defense of homosexuality, nor any positive depiction of it. You're probably young, but some of us are old enough to remember that time.
History contains Martin Luther King having to write from the Birmingham jail.
And history contains a lot of people getting lynched for very non-white speech. Speech that somebody found threatening.
Don't prate at me about how those things were obviously morally different from some privileged, enlightened modern perspective. Tell me what set of institutions and practices would have enabled that speech then, when it was needed... and remember it was needed exactly because all the "nice people" thought such speech led to behavior that those people truly thought would destroy society. It doesn't matter if they were wrong and you're right; give me institutions that still work when most people are wrong.
A whole shit-ton of bodies, of every color, have been ground up in the name of keeping down "dangerous" speech. And a lot of the "dangerous" speech that's gotten squashed has been in the form of complaints about the mistreatment of those very "bodies" you're so worried about.
How dare you claim the mantle of history of all things?
Yes, history is rife with examples of numerous avenues for speech being all but closed to certain ideas — ideas that, over time, were eventually given their due (e.g., same-sex marriage, opposition to the military draft). But you likely wouldn’t have found supporters of the kind of free speech absolutism you espouse now back in those days. (Or do you sincerely believe that, in the 1950s, you would have found widespread support for the idea of forcing newspapers to carry neutral-to-positive stories about gay people?)
You’re suggesting, without outright saying, that services such as Facebook should be forced to host all legally protected speech no matter what. What you’re not doing is explaining why you believe Facebook (or Twitter, YouTube, etc.) should be forced to do that even if the admins don’t want to host, say, White supremacist propaganda or ads for the literal torture that is “conversion ‘therapy’ ”. Nor are you explaining why, if such compelled hosting applies to Facebook, that principle shouldn’t apply to services that aren’t as big as Facebook (e.g., Gab). I’d love to know your full stance on the matter, reasoning included.
Re:
The "forcing" thing is your own straw man. Don't try to make it my problem to defend what you fantasize to be my position.
What I would have found in the 1950s would have been every other newspaper piling on to any paper that tried to carry anything that wasn't viciously condemnatory. I'd have found huge numbers of subscribers cancelling or threatening to do so, editors being shunned at professional events, reporters having trouble finding new jobs, and politicians "expressing concern" and trying to do anything they could to create a chilling effect. Oh, and once the frenzy got going, I'd probably have seen people trying to root out formerly tolerated "homosexual elements", too.
... which is exactly the sort of thing we're seeing now with "hate speech" (as defined in ways that tend to shift for rhetorical convenience).
And, by the way, this is about suppression, period. The "algorithmic trap" argument is a red herring designed to make the whole program look more palatable. It's pretty stupid to actively go looking for something, and then complain that a computer offers you more of it, but even so, that's not what this is really about. If you go look at, for example, the proposals this very article points to, they make it very clear that the people pushing them won't be happy, and won't go away and stop hassling the platforms, unless certain material is completely suppressed, not just not recommended.
How about if we start by not putting massive outside pressure on Facebook not to carry this or that? There's plenty of room for that without even starting to talk about social pressure to carry anything.
In this thread, nobody that I've noticed except you, and certainly not me, has suggested forcing anybody to carry anything. The subject has not come up. Nor have I seen it get any currency anywhere else. That's simply not on the table, not a realistic threat, and not a legitimate thing for you to bring up.
On the other hand, the idea of outright legally forcing platforms not to carry certain content has been floated even in the US (where admittedly it won't happen), and actually implemented elsewhere. And, outside of the legal arena, the generalized social pressure has been massive, to the point where you really could start to call it "forcing" them.
So maybe they shouldn't have to be constantly on the defensive, up to and including having to show up in Congress to be scolded, if they do decide to carry some content or another?
What's going on is an influence campaign to push the platforms to disadvantage specific content. I don't happen to like that particular content, but I sure as hell do like having social norms, of all kinds, that are hostile to that kind of pressure.
And, frankly, setting up a culture of suppression like that is a fucking moronic thing for any disadvantaged person or group to want to do. When the current vogue for this particular issue dies down, the norms, structures, and habits it creates will still be there. They will be co-opted by people who operate long-term, and who will use those norms to fuck over either the same people who want this now, or similarly situated future groups. That's how this stuff works.
Watch the news, and you'll find right-wing dictators already picking up language from the "anti-hate-speech" movement to portray themselves and their supporters as victims. I assure you those people are not genuinely concerned for any oppressed person of any color or lack thereof.
Hell, the damned Trump administration is lining up to get some of that juicy victim treatment. Jerkoffs like that can turn this kind of rhetoric around that far and that fast, and people don't think it will backfire?
I’ll get to the hate speech points in a sec, but I wanted to kick around this point a bit:
Any argument that rails against Facebook, Twitter, etc. “censoring” or “suppressing” what the admins of those services consider “hate speech” inevitably comes back around to the idea of “they should be forced to carry that speech”. You can’t talk about a “culture of suppression” without talking about what speech is being “suppressed”. You can’t talk about “the free exchange of ideas” without getting into the exchange of heinous, harmful, and outright dangerous ideas. And you can’t argue “maybe Facebook shouldn’t be deleting all this speech” without reaching the obvious conclusion: If Facebook refuses to host a certain kind of speech and Facebook “shouldn’t” be doing that, only the law could possibly force it to host that speech. (And that assumes such a law would even get past a court challenge on a First Amendment "right of association" argument.)
Now, if you’re not against the notion that the admins of Facebook, Twitter, etc. have every legal, moral, and ethical right to delete whatever speech they want for practically whatever reason they want, fine, cool, I’ve no issue with you. But if you think that none of those admins “should” be able to moderate legally protected speech that they consider offensive, you’re walking right up to the line of “the law should force them to host it” without actually stepping over it. No service — not Facebook, not Twitter, not YouTube or Stormfront or Gab or 4chan or Something Awful or mastodon.social or Soundcloud or MySpace or DeviantArt or…well, literally any platform that accepts user-generated content — should have to host speech against the will of its admins. And no one, including you, has yet presented a solid argument for why the law should force that to happen.
Why, it’s almost as if “hate speech” is an inherently subjective term and no one, including the Supreme Court of the United States, can (or will) come up with a definition that doesn’t somehow violate the First Amendment.
Imagine that~.
Therein lies the point: People who look up videos to keep up with the news aren’t asking to see conspiracy theorists and crackpot “medical experts” talking about demon semen. But they might be exposed to those kinds of channels via preëxisting algorithms. Watching a Fox News video isn’t inherently awful, at least in a vacuum; having your recommendations show you Qanon garbage because enough viewers of that video also watch Qanon garbage, on the other hand…
So what? Facebook has no obligation to give them what they want. And while you might cry about a slippery slope, I’m not going to cry over speech like racial slurs and explicitly anti-gay propaganda getting the boot from Facebook. If and when the matter comes down to otherwise innocuous speech, we can have a chat about the slippery slope. Until then: I don’t care.
If the content is innocuous — everyday bog-standard political speech, for example — yes, they shouldn’t have to defend that. But if we’re talking about White supremacist propaganda or misleading information about COVID-19 that might hurt/kill people or something else along those lines? I’m not going to pity those companies for being asked to defend the reason(s) for their hosting that speech.
Yes, yes, people will game the system once they figure out the rules. So what? That’s been happening since long before you and I were ever born, and it’ll be going on long after we’re both dust in the wind. The trick is that rather than keeping those social norms static and unchanging — being conservative, one might say — we update and evolve these norms as circumstances change so we can improve them and progress towards a better society. Change is the only constant in our lives; nothing lasts forever — and that includes those social norms you’re going on about.
Or do you think anti-gay speech is still as acceptable now as it was back in the 1950s?
Speech
It is difficult to comprehend how Tech Dirt would even allow the printing of this hysterical, emotional screed against freedom of speech.
Murder is illegal. It always is and always has been.
I can say I hate your guts every day, every minute, and twice on Sunday. If some dipshit decides to murder you because he read me saying I hate your guts? The dipshit is guilty of murder.
Speech is speech. Murder is murder.
The attempt by idiots like this one who wrote this article to somehow connect free speech with violence should be obvious to the editors of Tech Dirt.
There is no such thing as hate speech --The US Supreme Court.
Stop supporting this toxic illogical nonsense. Byrdsong was murdered by a murderer. Free speech is sacrosanct. Stop muddying things with this moronic garbage.
You’re missing the point because you’re too enamored with your free speech absolutism. I’m not surprised, just disappointed.
The point is this: Services such as YouTube, Google, Amazon, and Facebook tailor algorithms to match user experiences, and those algorithms can be “trained” to serve up hateful speech if they’re not carefully curated by those services. Looking up a Trump speech on YouTube to hear whatever dumb shit that dumbshit has to say is one thing. YouTube using that search to make a personal algorithm that recommends to you channels and videos that lean further to the political right (and use far more explicitly bigoted language) than even Trump because those channels talk about Trump is a whole other ballgame. The question, then, is whether those services have an obligation to prevent their algorithms from recommending hateful speech and misinformation that might radicalize people into hate and violence (or to host that speech in the first place).
OMG, what an informative article from Brandi Collins-Dexter that I will pass on. Years ago, Gil Scott-Heron wrote/said "The Revolution Will Not Be Televised." That was in the year 1971. Now, in 2020, 49 years later, the revolution is now being televised. Stay woke. Delores (Cookie) Garmon.
Brandi, I'm simply proud of you!! Meek and humble beware they change their minds! Langston Hughes⭐️🌙