Bad Section 230 Bills Come From Both Sides Of The Aisle: Schakowsky/Castor Bill Would Be A Disaster For The Open Internet
from the that's-not-how-any-of-this-works dept
It truly is stunning how every single bill that attempts to reform Section 230 appears to be written without any intention of ever understanding how the internet or content moderation works in actual practice. We've highlighted tons of Republican-led bills that try to force websites to host more content, not realizing (1) how unconstitutional that is and (2) how it would turn the internet into a giant garbage fire. On the Democratic side, the focus seems to be much more on forcing companies to take down constitutionally protected speech, which similarly (1) raises serious constitutional issues and (2) will lead to massive over-censorship of perfectly legal speech just to avoid liability.
The latest bill of the latter kind comes from Reps. Jan Schakowsky and Kathy Castor. Schakowsky has been saying for a while now that she was going to introduce this kind of bill to browbeat internet companies into being a lot more proactive in taking down speech she dislikes. The bill, called the Online Consumer Protection Act, has now been introduced, and it seems clear that it was written without ever conferring with anyone with any experience in running a website. It's the kind of thing you write when you've just come across a problem, but don't think it's worth talking to anyone to understand how things really work. It's also very much a "something must be done, this is something, we should do this" kind of bill that shows up way too often these days.
The premise of the bill is that websites "don't have accountability to consumers" for the content posted by users, and that they need to be forced to have more accountability. Of course, this leaves out the rather basic fact that if "consumers" are treated badly, they will go elsewhere, so every website already has some accountability to consumers: if a site handles things badly, it loses users, advertisers, sellers, buyers, whatever. But, that's apparently not good enough for the "we must do something" crowd.
At best the Online Consumer Protection Act will create a massive amount of silly busywork and paperwork for basically any website. At worst, it will create a liability deathtrap for many sites. In some ways it's modeled after the idiotic policy we have regarding privacy policies. Almost exactly a decade ago we explained why the entire idea of a privacy policy is dumb. Various laws require websites to post privacy policies, which no one reads, in part because it would be impossible to read them all. The only way a site gets in trouble is by not following its privacy policy. Thus, the incentives are to craft a very broad privacy policy that gives sites leeway -- meaning they have less incentive to actually create more stringent privacy protections.
The OCPA takes basically the same approach, but... for "content moderation" policies. It requires nearly every website to post one:
Each social media platform or online marketplace shall establish, maintain, and make publicly available at all times and in a machine-readable format, terms of service in a manner that is clear, easily understood, and written in plain and concise language.
That terms of service will require a bunch of pointless things, including a "consumer protection policy" which has to include the following:
FOR SOCIAL MEDIA PLATFORMS.—For social media platforms, the consumer protection policy required by subsection (a) shall include—
(A) a description of the content and behavior permitted or prohibited on its service both by the platform and by users;
(B) whether content may be blocked, removed, or modified, or if service to users may be terminated and the grounds upon which such actions will be taken;
(C) whether a person can request that content be blocked, removed, or modified, or that a user’s service be terminated, and how to make such a request;
(D) a description of how a user will be notified of and can respond to a request that his or her content be blocked, removed, or modified, or service be terminated, if such actions are taken;
(E) how a person can appeal a decision to block, remove, or modify content, allow content to remain, or terminate or not terminate service to a user, if such actions are taken; and
(F) any other topic the Commission deems appropriate.
It's difficult to look at that list and not laugh and wonder if whoever came up with it has ever been anywhere near a content moderation or trust & safety team, because that's not how any of this works. Trust & safety is an ongoing effort that constantly has to adjust and change with the times, and there is no possible policy that can cover all cases. Can whoever wrote this bill listen to the excellent Radiolab episode about content moderation and think through how that process would have played out under this bill? If every time you change the policies to cover a new case you have to publicly update your already ridiculously complex policies -- while the requirement remains that those same policies be "clear, easily understood, and written in plain and concise language" -- you've created an impossible demand.
Hell, someone should turn this around and push it back on Congress first. Hey, Congress, can you restate the US civil and criminal code such that it is "clear, easily understood, and written in plain and concise language"? How about we try that first before demanding that private companies be forced to do the same for their ever-changing policies?
Honestly, requiring all of this to be in a policy is just begging angry Trumpists to sue websites claiming they didn't live up to the promises made in their policies. We see those lawsuits today, but they're kicked out of court under Section 230... except Schakowsky's bill exempts this part from 230. It's bizarre to see a Democratic bill that will lead to more lawsuits from pissed-off Trumpists who have been removed from platforms, but that's what this bill will do.
Also, what "problem" does this bill actually solve? From the way the bill is framed, it seems like Schakowsky wants to make it easier for people to complain about content and to get the site to review it. But every social media company already does that. How does this help, other than put the sites at risk of liability for slipping up somewhere?
The bill then has separate requirements for "online marketplaces" which again suggest literally zero knowledge or experience with that space:
FOR ONLINE MARKETPLACES.—For online marketplaces, the consumer protection policy required by subsection (a) shall include—
(A) a description of the products, product descriptions, and marketing material, allowed or disallowed on the marketplace;
(B) whether a product, product descriptions, and marketing material may be blocked, removed, or modified, or if service to a user may be terminated and the grounds upon which such actions will be taken;
(C) whether users will be notified of products that have been recalled or are dangerous, and how they will be notified;
(D) for users—(i) whether a user can report suspected fraud, deception, dangerous products, or violations of the online marketplace’s terms of service, and how to make such report;
(ii) whether a user who submitted a report will be notified of whether action was taken as a result of the report, the action that was taken and the reason why action was taken or not taken, and how the user will be notified;
(iii) how to appeal the result of a report; and
(iv) under what circumstances a user is entitled to refund, repair, or other remedy and the remedy to which the user may be entitled, how the user will be notified of such entitlement, and how the user may claim such remedy; and
(E) for sellers—(i) how sellers are notified of a report by a user or a violation of the terms of service or consumer protection policy;
(ii) how to contest a report by a user;
(iii) how a seller who is the subject of a report will be notified of what action will be or must be taken as a result of the report and the justification for such action;
(iv) how to appeal a decision of the online marketplace to take an action in response to a user report or for a violation of the terms of service or consumer protection policy; and
(v) the policy regarding refunds, repairs, replacements, or other remedies as a result of a user report or a violation of the terms of service or consumer protection policy.
Honestly, this reminds me a lot of Josh Hawley's bills, in that it seems that both Hawley and Schakowsky want to appoint themselves product manager for the internet. All of the things listed above are the kinds of things that most companies do already because you need to do it that way. But it's also the kind of thing that has evolved over time as new and different challenges arise, and locking the specifics into law does not take into account that very basic reality. It also doesn't take into account that different companies might not fit into this exact paradigm, but under this bill will be required to act like they do. I can't see how that's at all helpful.
And, it gets worse. It will create a kind of politburo for how all internet websites must be run:
Not later than 180 days after the date of the enactment of this Act, the Commission shall conduct a study to determine the most effective method of communicating common consumer protection practices in short-form consumer disclosure statements or graphic icons that disclose the consumer protection and content moderation practices of social media platforms and online marketplaces. The Commission shall submit a report to the Committee on Energy and Commerce of the House of Representatives and the Committee on Commerce, Science, and Transportation of the Senate with the results of the study. The report shall also be made publicly available on the website of the Commission.
Yeah, because nothing works so well as having a government commission jump in and determine the "best" way to do things in a rapidly evolving market.
Also, um, if the government needs to create a commission to tell it what those best practices are, why is it regulating how companies have to act before the commission has even done its job?
There are a bunch more requirements in the bill, but all of them are nitty-gritty things about how companies create policies and implement them -- something companies are constantly changing, because the world (and the threats and attacks!) is constantly changing as well. This bill is written by people who seem to think that the internet -- and bad actors on the internet -- are a static phenomenon. And that's just wrong.
Also, there's a ton of paperwork for nearly every company with a website, including idiotic and pointless requirements that are busywork, with the threat of legal liability attached! Fun!
FILING REQUIREMENTS.—Each social media platform or online marketplace that either has annual revenue in excess of $250,000 in the prior year or that has more than 10,000 monthly active users on average in the prior year, shall be required to submit to the Commission, on an annual basis, a filing that includes—
(A) a detailed and granular description of each of the requirements in section 2 and this section;
(B) the name and contact information of the consumer protection officer required under subsection (b)(4); and
(C) a description of any material changes in the consumer protection program or the terms of service since the most recent prior disclosure to the Commission.
(2) OFFICER CERTIFICATION.—For each entity that submits an annual filing under paragraph (1), the entity’s principal executive officer and the consumer protection officer required under subsection (b)(4), shall be required to certify in each such annual filing that—
(A) the signing officer has reviewed the filing;
(B) based on such officer’s knowledge, the filing does not contain any untrue statement of a material fact or omit to state a material fact necessary to make the statements, in light of the circumstances under which such statements were made, not misleading;
(C) based on such officer’s knowledge, the filing fairly presents in all material respects the consumer protection practices of the social media platform or online marketplace; and
(D) the signing consumer protection officer—(i) is responsible for establishing and maintaining safeguards and controls to protect consumers and administer the consumer protection program; and
(ii) has provided all material conclusions about the effectiveness of such safeguards and controls.
So... uh, I need to hire a "consumer protection officer" for Techdirt now? And spend a few thousand dollars every year to have lawyers (and, most likely, a new bunch of "compliance consultants") review this totally pointless statement I'll need to sign each year? For what purpose?
The bill also makes sure that our courts are flooded with bogus claims from "wronged" individuals thanks to its private right of action. It also, on top of everything else, exempts various state consumer protection laws from Section 230. That's buried in the bill, but it is a huge fucking deal. We've talked about this for years, as various state attorneys general have been demanding it. But that's because those state AGs have a very long history of abusing state "consumer protection" laws to effectively shake down companies. A decade ago we covered the definitive example of this, watching dozens of state attorneys general attack Topix, with no legal basis, because they didn't like how the company moderated its site. They were blocked from doing anything serious because of Section 230.
Under this bill, that will change.
And we've seen just how dangerous that can be. Remember how Mississippi Attorney General Jim Hood demanded all sorts of information from Google, claiming that the company was responsible for anything bad found online? It later came out (via the Sony Pictures hack) that the entire episode was actually funded by the MPAA, with Hood's legal demands written by the MPAA's lawyers, as part of Hollywood's explicit plan to saddle Google with extra legal costs.
Schakowsky's bill would make that kind of corruption an everyday occurrence.
And, again, the big companies can handle this. They already do almost everything listed anyway. All this really does is saddle tons of tiny companies (earning more than $250k a year?!?) with ridiculous and overly burdensome compliance costs, which open them up to not just the FTC going after them, but any state attorney general, or any individual who feels wronged by the rules.
The definitions in the bill are so broad that it would cover a ton of websites. Under my reading, it's possible that Techdirt itself qualifies as a "social media platform" because we have comments. This is yet another garbage bill from someone who appears to have no knowledge or experience of how any of this works in practice, but is quite sure that if everyone just did things the way she wanted, magically good stuff would happen. It's ridiculous.
Filed Under: busywork, consumer protection, content moderation, ftc, jan schakowsky, kathy castor, paperwork, private right of action, section 230, terms of service
Reader Comments
Damn, I used a perfectly good joke last week that would’ve fit in right here. Curse my impulse to take cheap potshots at the troll brigade!
…but seriously, this bill is shit and no one with any in-practice knowledge of moderation should defend it.
Basically, politicians don't want just anyone with access to a keyboard to have a voice in this country. (Translation: they don't want to enable free speech any more than they have to)
No, they don’t want anyone who can access a keyboard to have a voice that can be easily heard. Twitter and Facebook allow for that thanks to sharing protocols (e.g., retweets). You can still run a blog, but unless you’re well-connected on social media, nobody will really notice it.
Silencing the voices of the people
Exactly, this. The conversation was much easier to control when news agencies had to expressly approve anything that was said.
Nowadays, people know that police officers shoot innocent people, that legislators lie all the time and that corporations are not sufficiently regulated to prevent their products from injuring or killing people.
When the public becomes aware of a thing, there's pressure to fix it. And if the people demanded all the wrong things got fixed, then where'd we be?
"The report shall also be made publicly available on the website of the Commission."
That website better have a top-notch consumer protection policy....
Just once, I'd like to see one of these bills that offers even a minimal justification for the numbers it throws about. Annual revenue of more than $250,000? More than 10,000 monthly active users? Why draw the lines there, instead of anywhere else? For that matter, let's see an explanation of what an "active user" is. Are lurkers who create an account but don't post "active"? How do we count anonymous contributors, like the people who make edits to Wikipedia pages without creating accounts and whose edits are logged by their IP address? One IP address could be many people, and one person could use many IP addresses....
Can they handle the hundreds of cases in different jurisdictions that this would enable, especially if they are a co-ordinated attack and timed to be concurrent? Could they find enough lawyers to do so?
FTFY
How about Congress just fingers itself lol
The best way to explain why the answer to "what is best" is "it depends" is by describing what it depends on. Collect all that describing in one place, then pass it out to those who make the laws, and to each incoming class.
Re:
That would require effort and they would prefer the companies carry the water.
suggest they think of
Standing up in congress to get something passed.
Having everyone in the chamber rattling on about this and that and not paying attention to what you have to say.
Then the lead person Bangs a gavel, and Gathers attention to the SUBJECT, you and your bill.
Now if there ISN'T/wasn't control over the congress and keeping things going, you might as well be a group of 2 year olds, talking about anything and everything all at once.
And I can see the interns now, running around throwing papers and not paying attention AT ALL.
What do you want? The problem you have tends to be figuring out WHAT you don't want, not what you would like.
But let's pass the bill, and let everyone SPAM the net. But that includes letting us spam movies and music all over the place also.
and they won't stop coming because the pricks that keep writing them don't understand 230 and won't understand 230 until it's gone, then they'll be the first ones to miss it, the first ones to need it and the first ones to deny they had anything to do with dismantling it!!
Job Opening
I would like to start out by thanking you for the opportunity to apply for the Consumer Protection Officer position at Techdirt.
As you probably already know, I have been a loyal Techdirt user for 12 years, and I am sure you can find anything that you need in the very detailed profile you have maintained on me; as such, I am supremely qualified for this position.
But more to the point, having been a System Administrator and Programmer for 42 years, I am more than ready to outsource the production of a script that will combine input from the appropriate Reddit communities and Facebook groups into a dense word-salad report that will satisfy absolutely no one.
I have heard that you pay very generously and have excellent people skills and benefits; you are also very good looking and the smartest person in the room.
I believe that now that we have that settled, you can now get behind this awesome opportunity to improve the internet as we know it.
Powertoaster
'Every expert says I'm wrong, the problem must be them.'
This is yet another garbage bill from someone who appears to have no knowledge or experience of how any of this works in practice, but is quite sure that if everyone just did things the way she wanted, magically good stuff would happen. It's ridiculous.
It is difficult to get a person to understand something when their position/argument depends upon their (real or feigned) ignorance of it.
Two things
Told you this would happen
and
The only way to fix 230 is to have companies declare if they are providers or publishers.
Re: Two things
That would divide web sites into two camps: cesspits and strictly edited magazines. Either way, you will be effectively silenced.
Re: Two things
The only way to fix 230 is to have companies declare if they are providers or publishers.
That would fix absolutely nothing. It honestly is dumber than any of the bills currently being pushed, including this one.
How do you think that would work? A site declares itself a "platform" and then can't moderate at all? Great, then you get a garbage dump of a site filled with spam, porn, and abuse. Useless garbage.
A site declares itself a "publisher" and now faces liability? Great, then you get a site that locks everything down, which few people can use, and where tons of important speech gets stifled.
Your "solution" is literally the worst of all worlds and shows less than no understanding of the situation. It shows that you so misunderstand things that your solution would inherently be dumber than pretty much any other solution.
This is an unfathomably stupid idea.
Re: Re: Two things
Go say that to Facebook or Twitter. Social media "providers" are just that. Communication providers meant to facilitate communication between users, with ad-placements used for generating revenue. It is by definition useless garbage. No different than AT&T or the Post Office, and just as much an essential service. Having certain guarantees enforced by law, like no reading of the mail for the post office, or that service is available to everyone regardless of political views, is a good thing.
Do you lock everything down? Yes, under this type of reform you might due to the legal burden. But a lockdown here on this site, isn't going to prohibit people from talking about your stories on more open communication "providers". Nor would it "stifle important speech" Hell, you encourage it now. Techdirt isn't the only place where people can talk about Techdirt's stories, and the legislation doesn't change that. It just introduces additional requirements for those sites who wish to straddle the lines between "publisher" and "provider." I.e. Provide a place for user-generated content along side first party content and treat both as equals.
Actually your solution is the dumb one. Creating a well defined label for sites that primarily serve user-generated content makes it much harder for a would-be abuser to take said sites to court for speech protected by the First Amendment. Creating obvious labels so that even an idiot can realize "This site is HOT." I.e. "This site contains user-generated content that is not to be taken for factual reporting." Is a good thing. Especially when the internet and the IT industry in general has a massive hard-on for being both overly helpful to end-users and not expecting them to know or do anything related to their devices or personal security. While also demanding full situational awareness online and demanding that everyone else do the fact-checking themselves. We've seen how that policy works out: It resulted in a failed coup attempt committed by idiots who failed to distinguish fact from fiction. In short, you are the one who has no understanding of the expectations that have been laid out for the general public. Nor what needs to be done to correct it while maintaining as much freedom of expression as possible given those constraints.
Re: Re: Re: Two things
I see you are redefining things to fit your narrative. In reality, "Social Media Providers" are actually firms providing expertise in how to get the most out of social media for companies. Regardless, your argument is by definition useless garbage (see how easy it is to claim something is useless garbage, which makes it useless garbage).
Which breaks the first amendment into little pieces. You just removed the right to free association.
Uhm, we already have that without putting labels on sites, it's 47 U.S. Code §230.
So, gutting 230 is the only way to get idiots educated, because we have politicians and a bunch of traditional media outlets that have been lying their asses off repeatedly. Nothing in what you say will actually fix the cause: people in power who lie without consequence, plus the proliferation of lies online. And those who actually tried to point out the lies or moderate them away were the target of even more lies about how Section 230 enabled them to "censor" or fact-check people with impunity since it "infringed free speech" and how biased they were against conservatives.
Seriously, your argument is all over the place and doesn't make much sense.
Re: Re: Re: Two things
Go say that to Facebook or Twitter. Social media "providers" are just that. Communication providers meant to facilitate communication between users, with ad-placements used for generating revenue. It is by definition useless garbage. No different than AT&T or the Post Office, and just as much an essential service.
Nope. Extremely different. AT&T and the Post Office are conduits. They send information from this place to that place. And that's it. Social media hosts content. Meaning it's always there.
That's ENTIRELY different.
Do you lock everything down? Yes, under this type of reform you might due to the legal burden. But a lockdown here on this site, isn't going to prohibit people from talking about your stories on more open communication "providers".
Except, as noted, those "more open" providers are cesspools of garbage. Congrats, you've turned the internet into 8kun.
You're truly clueless.
Re: Re: Re: Two things
There, fixed that for you.
Yes or no: Do you believe a social interaction network designed primarily by and for Black people should have to host white supremacist propaganda posted by an actual white supremacist? Keep in mind that such speech is legally protected and white supremacy is a political ideology. And yes, the question remains the same even if you flip the racial identities.
But Techdirt would become the one place where people couldn’t talk about Techdirt articles — often with the writers of those stories themselves. I can’t think of any way to justify that outcome as a net positive.
Other than a desire to sue platforms into the ground over third-party speech, for what reason should the law make that distinction?
Section 230 already hamstrings the ability of an abusive asshole to sue sites into the ground over third-party speech.
I hope this isn’t an attempt to lay the blame for the Insurrection of January 6th at the feet of Facebook and Twitter. They didn’t tell people to go storm the Capitol building. Those companies played a role in spreading the mis- and disinformation that led to people believing Old 45’s Big Lie, sure. But saying they’re the reason people broke into the Capitol is ridiculous.
Expectations are disappointments planned in advance. Get over it.
Yes or no: Do you believe the government should have the legal right to compel any privately owned interactive web service into hosting legally protected speech that the owners/operators of said service don’t want to host? Keep in mind that speech such as racial slurs, anti-queer propaganda, and Limp Bizkit song lyrics are all protected speech.
Re: Two things
That stupid people come up with stupid ideas? That happens all the time, but the really stupid ones are those who are cheering it on.
Oh, tell us how this will fix things. Can you do that? Or are you just repeating the dishonest mantra of the disgruntled assholes?
Btw, the bill being discussed would more or less disappear almost all "contemporary conservative views," which is a plus, although most minor sites would be dead at the same time, which is a negative. Oh, it's also very unconstitutional since it explicitly targets speech.
Did you think through this proposal before you posted it, or did you think everyone would agree with you so you wouldn’t need to think it through?
Re:
Same as the person who crafted this bill.
It's amazing how stupid politicians do not understand Section 230 and seem to want to weaken it and pile useless busywork on websites. The web is constantly changing and moderation practices change weekly. Making it easy for random persons to sue any website is a bad idea. It can only result in making Google and Facebook stronger while reducing competition from startups and small websites, and reducing venues for free speech. A service that has to post a detailed list of moderation practices is providing a guide to trolls. Section 230 is basically a shield, providing 1st Amendment protections for websites, blogs, and forums by protecting them from random legal action by trolls.
Way to steal my joke!
In all seriousness, I do not doubt that Techdirt would fall under the definition of a "social media site" in this bill. I'd be concerned that every other troll and copyright maximalist that is flagged and/or caught by the spam filter would then have a cause of action against Techdirt.
But beyond that, the fact that the rules would be enforced by the FTC is also concerning. The FTC defines a website under a different law (COPPA) to include individual YouTube channels and not just YouTube itself. Under this law, would any YouTuber that moderates their comment section (i.e., EVERY YouTuber) be opening the door to pointless litigation between commenters and YouTubers? I could be wrong, but I normally wouldn't have thought COPPA affected individual YouTube channels until the FTC said otherwise.
Another question worth asking the sponsors of this bill is... Can the FTC even handle this? A lot of anti-Section 230 arguments make more sense as pro-net neutrality arguments. When the FCC got rid of net neutrality in 2017, one common argument I've seen from defenders of the repeal is that the FTC would be able to handle the complaints about net neutrality instead of the FCC. Yet, even before it was voted on, the FTC came out and said they couldn't handle such enforcement. And considering that it's a much taller order to police websites as opposed to internet service providers, I doubt the FTC would be able to handle this, either.
And to answer your question about what problem it solves, it appears that at this point, Section 230 is a problem simply because it's a problem. Until we address what it is we're even trying to do, we shouldn't be trying to reform Section 230 in every which way possible. Regardless of what some conservatives and even a few liberals might say, Section 230 is working well, and as the old saying goes, "If it ain't broke, don't fix it!"
Re:
Agreed.
It wouldn't create a new commission. The commission it refers to is the FTC. This is clarified in the definitions.
Contributions?
What are the chances many of these bills are designed to do nothing other than ensure that the campaign contributions from the big internet companies keep coming in? We know legislators will sometimes announce they're "considering" introducing legislation on a topic just to whip the lobbyists into a frenzy.