Utah's Horrible, No Good, Very Bad, Terrible, Censorial 'Free Speech' Bill Is A Disaster In The Making
from the that's-not-how-any-of-this-works dept
A month ago, we noted that a bunch of state legislatures were pushing blatantly unconstitutional bills to try to argue that social media websites can't moderate "conservative" views any more. All of these bills are unconstitutional, and most are just silly. The latest one comes from Utah -- and stunningly seems to have legs as it's been moving through the Utah legislative process with some amount of speed over the last week or so.
The bill, SB0228 from state Senator Michael McKell, is so bizarrely wrong on just about everything that it makes Utah look really bad. It's called the "Freedom from Biased Moderation Act" and already that's a pretty clear 1st Amendment problem. Leaving aside the question of whether or not there's any evidence of "anti-conservative bias" in social media moderation (and, just so we're clear: there is no such evidence), even if there were moderation decisions biased against political viewpoints, those decisions would be protected by the 1st Amendment. For good reason.
Courts have made it clear, repeatedly, that the 1st Amendment bars the government from compelling anyone to associate with speech they disagree with. Yet, that's exactly what this bill, and others like it, are seeking to do. Any law that bars the ability to moderate would violate this key part of the 1st Amendment.
But this bill is even more nefarious, in that it couches many of its proposals in ideas that sound reasonable, but they only sound reasonable to people who are totally ignorant of how content moderation works. A key part of the bill is that it requires social media companies to "clearly communicate" the "moderation practices" including "a complete list of potential moderation practices." That's ridiculous, since many cases are unique, and any company doing this stuff has to constantly be responding to changing context, different circumstances, new types of attacks and abuse, and a lot more. This bill seems to presume that every content moderation decision is an obvious paint-by-numbers affair. But that's not how it works at all. Senator McKell should be forced to listen to Radiolab's Post No Evil episode, which details how every time you think there's an easy content moderation rule, you discover a dozen exceptions to it, and you have to keep adjusting the rules. Every damn day.
Then the bill says that a social media company cannot "employ inequitable moderation practices." But what does that even mean? Again, every moderation decision is subjective in some way. When we ran 100 content moderation professionals through a simulator with 8 different content moderation decisions, we couldn't get any level of agreement. Because so many of these are judgment calls, and when you have thousands or tens of thousands of moderators making thousands to hundreds of thousands to millions of these judgment calls every day, you're always going to be able to find some "inequitable" results. Not because of "bias" but because of reality.
And how would you even define "inequitable" in this situation anyway? Because context matters a lot. All sorts of factors come into play. Someone in a position of power saying something can be very different from someone not in power saying the exact same thing. Something said after an attack on the Capitol might look very different than something said before an attack on the Capitol. Every situation is different. Demanding the same treatment ignores that the situations will always be different.
Indeed, just recently, in discussing Facebook bending over to keep Trumpists happy on the platform, we noted that the company was confusing equitable results with equitable policies. Part of the issue is that, right now, you have more utter nonsense and misinformation on the Republican side of the aisle, and if you use "equitable policies" you will get inequitable results. But this bill seems to think that inequitable results must mean inequitable policies. And that's... just wrong.
Next, the bill requires a "notice" from a social media company for any moderation, including an explanation of exactly why that content was moderated. Again, I understand why people often think this is a good idea (and there are times when it would be nice for companies to do this, because it is often frustrating when you are moderated and it's not clear why). However, this only works if you're dealing with non-abusive, honest actors. But a significant amount of moderation is to deal with dishonest, abusive actors. And having to explain to each one of them exactly why they're being moderated is not a recipe for better moderation; it's a system for (1) having to litigate every damn moderation decision, as the person will complain back "but I didn't do that" even when they very clearly did, and (2) training abusive, dishonest trolls how to game the system.
Then, the bill has this odd section where it basically would attempt to force social media companies to hire Facebook's Oversight Board (or some other brand new entity that does the same basic thing) as an independent review board:
A social media corporation shall engage the services of an independent review board to review the social media corporation's content moderation decisions.
While this might be a good idea for some companies to explore, for the government to mandate it is absolutely ridiculous. I thought the Republican Party was about keeping government out of business, not telling companies how they have to run their business. It gets even sillier, because the Utah legislature thinks that it gets to dictate what types of people can be on such independent review boards:
The independent review board shall consist of at least 11 members who represent a diverse cross-section of political, religious, racial, generational, and social perspectives.
The social media corporation shall provide on the social media corporation's platform biographies of all of the members of the independent review board.
If this law actually passed, and wasn't thrown out as obviously unconstitutional, I'd love to see the Utah legislature determining if the mandated review board for... let's say OnlyFans, had the proper "religious, generational, and social" diversity...
As an enforcement mechanism, the bill would give the Utah Attorney General the ability to take action against companies deemed to be violating this law (which, again, would be every company, because it sets up a nonsensical standard not based in reality).
This bill is an unconstitutional garbage fire of nonsense, completely disconnected from anything even remotely recognizable as to how content moderation works at social media companies. Utah should reject it, and maybe should get someone to teach Senator McKell some basics about the 1st Amendment and how social media actually works.
Filed Under: 1st amendment, bias, content moderation, inequity, michael mckell, utah
Companies: facebook, twitter
Reader Comments
Well if you insist...
Personally, after reading the article about Facebook bending over backwards for 'conservatives', I would love to see equal treatment by social media, if only to enjoy the screams of outrage and claims of persecution cranked up to 11.
Those trying to force social media to host any and all speech don't want equal treatment; they want privileged treatment: exemptions to the rules so they never have to face consequences for their words and actions, which is just a tad rich coming from the party of 'small government' and 'personal responsibility'.
Those pushing these wildly unconstitutional bills should just put their money where their mouths are and propose a government run social media platform where all legal speech is welcome because the first amendment is king and moderation outside of that single limit is out of bounds. Then when that turns into a disgusting cesspit people can have a solid and very visible example of what they are trying to turn the current social media platforms into and why that's a bad idea.
Forcing 'conservative' viewpoints into every conversation can't backfire by convincing non voters they had better get out and vote for the other party, can it?
Ah, but therein lies the problem: Even if he knew, he wouldn’t care. His party trafficks in lies and grievances and lies about grievances; their feelings matter more than actual facts. The First Amendment preventing compelled association doesn’t matter to Republicans. That’s become painfully clear by how many times they’ve called for the government to compel association between social media services and the bigoted assholes that said companies ban on a regular basis.
Every argument they have on this front is couched in fear — a fear of being “unheard” by an audience bigger than what they have(/deserve). That fear of being “left behind” by broader society drives bills like this one. It exposes their weakness, too: An expressed lack of either knowledge or fucks to give about the First Amendment proves their unfitness for office.
I’m pretty sure 2020 had a fair bit of that going on.
Radiolab
"I think that they will inevitably fail.
But they have to try.
And I think that we should all be rooting for them."
Re:
The First Amendment preventing compelled association doesn’t matter to Republicans.
Depends on the circumstances, they seem to be all for being able to refuse association or interaction when it comes to defending their ability to engage in bigotry.
Re:
Not to mention providing even more reasons they're unfit for office and deserve to be voted out whenever possible.
Much like the extreme left, the extreme right does not want equality. They want superiority.
Same coin, different sides.
Re:
bold of you to assume there's an extreme left in the US that wants to rule like kings in the same way the right wing does, but hey, keep believing in that delusion if it keeps everyone else safe from you
I'm sorry, sir. Your profile shows you have 20 years of content moderation experience on a directorial level. However, on our diversity survey, you did not check off enough boxes to allow us to hire you.
Now, I'm not supposed to say this, but with our board's current makeup, would you be willing to present as cross-dressing conservative buddhist?
Re:
You seem to not understand the political spectrum.
Er...is this bill really so bad?
It specifies that it applies only to corporations that have 20,000,000+ account holders, so it's not going to dump a financial burden on small forums and chatrooms.
Providing a complete list of potential moderation practices doesn't sound wrong either; sure, it would need to be updated regularly with adjustments, exceptions, clarifications, and errata, but that's how common law works off the internet. I don't see how it would be a bad thing to build a library of moderation decisions.
I do like the notice requirements in the bill, too; if r/banned is any indication (watch out, it has its loonies), Reddit has a problem with arbitrary bans and a nonfunctional appeals process, so I'd say rules requiring moderation decisions to be clearly explained and appealable would be a very good step.
Is there something I'm missing about it that makes it a horrible no good very bad bill? A way parts of it could be maliciously interpreted, or something wrong with the enforcement mechanism?
This is a legislature that for decades accepted the business of children being kidnapped from other states to be abused in Utah's troubled-teen boarding-school-cum-prison industry, contrary to perceptions that society doesn't do that sort of thing any more. Funnily enough, it was McKell who championed intervening, though it took significant negative publicity and celebrity campaigning.
The free speech vs moderation issue does my head in. Wonder though how much of the problem isn't the speech itself, but the extent to which algorithms promoted and connected people with extremist views, and drew in others who otherwise wouldn't have become radicalised. In the old days, even on the internet, people were largely free to engage in that sort of speech, but it remained on the margins. Is that possible with social media? Just feels like they had a huge hand in creating the problem, and are now swinging the pendulum too far in the other direction in trying to quash it. Does Techdirt have any previous writings on that?
Re:
Yes, its intent is to allow racists, misogynists, and bigots to have their day on large social media sites. That is, it is intended to let the inhabitants of Parler, Gab, et al. back into mainstream social media.
Re:
Extreme Left in America: "We want tax-funded health care and affordable education."
Re:
This is a bad example because most of Reddit's moderation is done by non-employees. There are subreddit moderators who are completely independent and there is a smaller team of actual employees who review moderation decisions and enforce site-wide standards. Some subreddits get banned because the independent moderators fail to heed warnings about site-wide violations. That's why Reddit is so inconsistent. And this bill wouldn't resolve that.
Re:
You... might want to read the article again, because it lays out why it's a terrible idea and addresses your points. If you still don't understand the problems after that I suppose I could try to explain them, though I'm not sure how I could do a better job than the article.
Re:
It's also a terrible bill because it shouldn't be done at the state level. The internet doesn't just exist in Utah.
You'd also have to be willing for large social media companies to just say Utahns aren't allowed to make accounts because the burden of this law is too great.
Re:
I mean, you're half right, and by that I mean one of those groups actually does exist in a meaningful fashion...
Ah, but there's so much more nuance. They're the party of keeping government out of business when it's the businesses of their buddies. They're definitely interested in messing with any business that applies responsibility and consequences to their side.
Re: Re:
I think you might have to try and explain it to me; from what I can see the bill doesn't compel sites to host or block any specific user content, only that they must publish their terms of use and moderation decisions and apply those rules without discrimination.
What gives any governmental body any right to tell a private entity what legally protected speech it can and cannot, will and will not, must and must not moderate — and how said entity can/will/must go about moderating that speech? What specific law, statute, or “common law” legal precedent gives the Utah state government any right to, say, tell Twitter whether it can/will/must host racist speech so long as conservatives post it?
Re: Re: Re:
It's not just posting their moderation decisions. It's requiring an original moderator to write a reason, then they must allow the person to appeal, then they must employ a completely different moderator to review the appeal, then they must write an explanation of why the appeal is rejected, then the user can go to the attorney general's office, which determines whether the company has violated its own terms of use and then levies a fine.
This is a First Amendment violation because the company can't be compelled to allow speech it disagrees with, even if the disagreement is based on a bias. A social media company quite literally has no legal obligation to be "fair" to one view or the other. The loophole to the whole bill is to include a statement in the terms of use that all moderation decisions are biased, and then the company can never be seen as violating its own terms.
Applying those rules "without discrimination" is, quite simply, impossible. Moderation is literally a form of discrimination.
I used to do comment moderation appeals for a series of popular websites years ago. It was very time-consuming to write back to every person who appealed. I had to use a lot of canned responses because it was too time-consuming to write a personalized response to every appellant.
If the company I worked for had to fear getting fined for its moderation practices, it would have just shut down its comment sections rather than risk the loss of money. So the bill (if it could survive a constitutional challenge - which it won't) could lead to shutting down social media, not making it more equitable.
Re: Re: Re:
Into the fray then...
from what I can see the bill doesn't compel sites to host or block any specific user content
Specific, no, because those attacking 230 are rarely honest enough to own why they're attacking it, but they have grokked that specifics are very much not their friend in this argument, hence the usual generality of saying that social media is taking down 'conservative' content without actually defining what that is or pointing to specific examples. That said, if you leave the door open to punishing a company for a moderation decision you are very much levying a threat against them for doing so, 'encouraging' them to modify their moderation practices, either taking more down or less depending on how the law is written (in this case much, much less).
only that they must publish their terms of use and moderation decisions and apply those rules without discrimination.
Terms of service are already published (you kinda have to agree to them to use the site; it's not the platform's fault no-one reads them), so that bit is nothing more than empty theater for the gullible, unless you/they were talking about specific moderation rules, which is also theater but more naive/dishonest, something I'll address in a second. That said, honestly, the article addressed this one better than I think I could, so I'm just going to copy/paste their stuff and add in some commentary.
'A key part of the bill is that it requires social media companies to "clearly communicate" the "moderation practices" including "a complete list of potential moderation practices." That's ridiculous, since many cases are unique, and any company doing this stuff has to constantly be responding to changing context, different circumstances, new types of attacks and abuse, and a lot more. This bill seems to presume that every content moderation decision is an obvious paint-by-numbers affair. But that's not how it works at all. Senator McKell should be forced to listen to Radiolab's Post No Evil episode, which details how every time you think there's an easy content moderation rule, you discover a dozen exceptions to it, and you have to keep adjusting the rules. Every damn day.'
Contrary to what some politicians think/pretend to think, moderation is not simple, and that's before you scale things up to tens of millions of users. Context matters and there will always be people looking to game or bypass the rules, which is why it's vital to have flexibility in order to moderate. Anything too rigid will block a lot of content that might otherwise be flagged but is fine in context, while still letting bad actors squeak through by claiming that the rules didn't specifically bar what they did so it would be unfair to punish them, leaving the platform constantly having to update its rules to keep up.
Speaking of bad actors...
'Next, the bill requires a "notice" from a social media company for any moderation, including an explanation of exactly why that content was moderated. Again, I understand why people often think this is a good idea (and there are times when it would be nice for companies to do this, because it is often frustrating when you are moderated and it's not clear why). However, this only works if you're dealing with non-abusive, honest actors. But a significant amount of moderation is to deal with dishonest, abusive actors. And having to explain to each one of them exactly why they're being moderated is not a recipe for better moderation; it's a system for (1) having to litigate every damn moderation decision, as the person will complain back "but I didn't do that" even when they very clearly did, and (2) training abusive, dishonest trolls how to game the system.'
This one really shouldn't need more explanation than what was in the article, honestly. Requiring a platform to justify every gorram moderation decision is not only going to be a massive pain in the ass that 'encourages' platforms to moderate vastly less, it also assumes good faith on the part of the moderated, which is frankly naive in the extreme. No-one's going to admit that yeah, they knew they were breaking the rules but did X anyway, when they know they can lie and escape punishment or face nothing more than a slap on the wrist and an empty warning, and if they're feeling vindictive there would always be the option to claim unfair discrimination, an accusation that would carry very real risks to the platform.
Adding to the problem, by forcing platforms to explain exactly what triggered the penalty you're practically handing trolls and other bad actors a cheat-sheet on what specific acts/words to avoid, as they can just make minor changes and you're back to square one, explaining that yes, that word/act counts as a violation as well, even if they assure you that they were just trying to follow the rules by not using the specific word you called them on last time.
You've also got the wildly insane bit about mandatory 'oversight' boards and what exactly they would be staffed with (and oh, the conflicts of interest and deadlocks I could see in those...), and I would hope I don't have to explain how the government telling a company not only that it must moderate in a certain way but that it is required to pay to put together a group to double-check its work is problematic from a first amendment perspective.
Re: Re: Re: Re:
So...basically, there are bad actors, spammers and trolls all over the place, who would use and abuse technicalities and the appeals system to the point that moderators couldn't effectively enforce any rules?
Fair enough. I've had trouble with abusive moderators in the past so I was looking at this bill from the perspective of keeping them in line, but if it's going to enable the social-media-equivalent of vexatious litigators...yeah. Thanks for explaining.
Re: Re:
"Depends on the circumstances, they seem to be all for being able to refuse association or interaction..."
Of course they do. The republican whining about "cancel culture" and "Being silenced" is all about desperate fear - of society condemning them the same way society used to condemn being black, liberal, gay, or trans.
"Of course you have the right to be the way you are. If you don't tell then I shan't ask! But If you do tell you have no complaints coming when you get fired from your job, denied service, are thrown out of the bar or kicked out of the army! We can't compel people to like you!"
If the above sounds familiar then it would be because that's the way republicans justified black people, gay people, liberals, leftists, damyankees and anyone else who was different being ostracized by society back in darker times, when societal judgment was still mainly about what you were rather than what you did.
Now that society has progressed they know damn well what to expect and are being whiny and entitled snowflakes about it.
In their own words; "Fuck their feelings!"
There's just one main difference. They, at least, can change. They don't have to be repugnant assholes. They don't have to be bigots. Kicking the grievance addiction might be rough but it's doable. They weren't born repulsive and intolerant fuckwits.
But that's not going to happen while the rest of US decides to reward them all by giving them yet another free pass here.
Those 73 million americans who kept voting for white supremacy and authoritarianism need to be run into the wilderness by the saner majority and kept away from civilization until they learn to live with other people.
Tolerance is an exercise in reciprocity and those people are all too happy to deny others that which they demand for themselves. They have no place among those who choose to abide by the social contract.
Re:
"Much like the extreme left, the extreme right does not want equality. "
What extreme left, kemo sabe? The only left-wingers I know of as extremist as US right-wingers today would be the long-dead Rote Armee Fraktion in Germany or other DDR-backed terrorist groups of the 70's.
What too many americans today call "leftists" are, in reality, centrist-right. That should tell you volumes about just how far out on the brink you guys are...
Re:
And therefore the GOP must stop those people from voting; it's their only avenue.
They even admit it: they cannot win if all eligible voters exercise their right.
Re: Re: Re:
"I think you might have to try and explain it to me;"
I think you will not understand anything that you do not already agree with; closed minds are like that.
If it is contrary to my beliefs then it is wrong. - you probably
".. doesn't compel sites to host or block any specific user content ..... apply those rules without discrimination"
Perhaps you could explain how this works. A site is not compelled to host or block, but it is compelled to not discriminate; how do you do that? I can imagine where that would end up; can you explain why it would not cause huge problems?
Re: Re: Re: Re: Re:
There is also the problem of how you run a large social media site if every moderation decision requires a manual review.
Re:
One side is in positions of power across all levels of government, cemented into place by gerrymandering and voter suppression so they can hold power without ever getting the majority of the votes. The other exists somewhere, maybe, in absolutely minuscule numbers, holding no power whatsoever and getting shouted down on forums by others on the same side of the political spectrum for their tankie bulls**t, because they make it harder to get even the slightest scrap of reform done.
One extreme has been embraced by their party and is making their lunacy into law. The other extreme exists on the internet if you look hard enough at the comments section of anything chapo related... Yeah, they're both absolutely the same.
Re:
Note they can only say they "feel" algorithms caused it. QAnon earlier felt that a pizza shop was trafficking child sex slaves in its nonexistent basement. Feelings are essentially used as a rationalization for carte blanche. Algorithms are a scapegoat, ascribed mystical power by the ignorant to justify their desire for arbitrary punishment.
It is as ignorant as suggesting that "possession of chemistry" should be illegal, because it covers damn near everything! No matter how you choose to order posts, it is an algorithm.
Radicalization has far more to do with perceived grievances, seeking connections, official responses, and dehumanization than mere exposure to views, let alone algorithms. Despite what every terrorist who publishes a manifesto thinks, their views aren't so persuasive that they only need exposure to change the world - that they had no traction before is a major hint.
Something something we can't use the really good filter that removes Nazis' comments from platforms b/c it removes too many Republicans...
I think I found the problem.
Re: Re: Re: Re: Re:
"I've had trouble with abusive moderators in the past so I was looking at this bill from the perspective of keeping them in line, but if it's going to enable the social-media-equivalent of vexatious litigators...yeah."
That's more or less it;
Rando commenter #1: "It's all the fault of dem <N-word> and <ze jews>! Antifa stole the Election! Hillary eats babies! But Obama! Raaah!"
Moderator: "oooh--kaay...? Blocking this one!"
Utah Court: "Hear ye! Hear ye! Court is now in session. Moderator, kindly prove to the court that Hillary does not, in fact, eat babies?"
Just more of Republicans thinking the US was made by the GOP and for the GOP and that no one else matters or has any rights.
Re: Re: Re: Re: Re:
Glad to have cleared it up for you.
Silly statements
Re: Re:
Social media and algorithms certainly aren't the only cause and shouldn't be "scapegoats" as you put it, but they are some of the major societal changes that happen to have arisen in line with the outcomes we've been seeing. Of course there are deeper underlying drivers as well (like the effects of economic policies of the past several decades) but there seems to be a more direct cause-and-effect relationship between social media and certain events.
Did end up looking for past articles on Techdirt dealing with the topic. Found one in relation to YouTube, which basically concluded that, if anything, it promoted centrist views and pushed people toward the mainstream. The study was done however after YouTube had already started cracking down on fake news etc., and doesn't take into account which videos people actually focus on - consider for example the tendency of people to sometimes be more affected by a single negative comment in a sea of 99 positive ones. What's the impact of one radical video among 99 "mainstream, centrist" ones? Do they function like forbidden fruit?
On the internet of old, by and large, you had to go out of your way to find stuff - it took more initiative on the part of the individual web surfer, and those seeking radical or extreme views probably ended up at some newsgroup or forum where they bounced ideas around with people on the other side of the world. On social media, however, it gets (or did get) pushed to people, even if it's the exception at first, and if they start down that rabbit hole, it comes with a culture of people forming real-world connections and organising real-world events. That's also to point out there's significantly more to the social media phenomenon than the YouTube study covered.
Anyway this is admittedly "seems to me" kind of stuff and it'd be cool to see more rigorous perspectives.
Re: Re: Re:
When it comes to politics and extremism, people have always managed to get organized. What does tend to correlate is political extremism and social inequality. Wikipedia has a list of peasant revolts. These are actual revolts, and the vast majority pre-date the phone, and even mail systems for the masses.
Now while the Internet speeds up communications, and the finding of like-minded people, it also makes it much easier for the authorities to keep an eye on what is going on.
Blaming the Internet is politicians blaming others for their failures, and the extremism inside US politics will fuel violent extremism. Trump has done more to drive people to extremism than the Internet.
It just passed the legislature
It now goes to the governor, who will hopefully veto it.
Re: It just passed the legislature
Of course it did. I'm so very glad that there are no more pressing concerns they could be focused on, leaving them the time to spare on this unconstitutional theater to rile up the gullible.
Just vetoed!
The governor just vetoed it. Unfortunately, Senator McKell says he'll try to revise it and sponsor it again in May.