How Internet Filtering Blocks All Sorts Of Legit Info
from the here-we-go-again dept
This has been pointed out before, but it's received less attention recently, despite the rise in interest in internet filters: the filters don't work very well. In fact, Mitch Wagner went looking and found all sorts of examples of internet filters in schools blocking all sorts of legit info:

The Canadian National History Society was forced to change the name of its magazine, The Beaver, founded in 1920, because the name of the magazine caused it to be blocked by Internet filters.

Again, some of these stories appear to be old ones, which the filtering industry insists have been fixed, but these kinds of filters will always create false positives.
One teacher wanted to show students some pictures that would illustrate the effects of atomic testing. "However when I went to bring the wikipedia page up at school during class, it was blocked by our internet filter, BESS. The name of the islands? 'Bikini Atoll,'" said Doug Johnson, quoting the teacher. Johnson, a director of media and technology at a Minnesota school district, put out a call in July for stories about how Internet filtering hobbles education, and got an earful. ("Censorship by Omission")
Johnson also shares a message from another teacher, describing how a school's systems security manager decided to block the social bookmarking site delicious.com. The reason? You can use the site to search for porn....
The problem goes back for years. A filter blocked the Web site of former House Majority Leader Richard Armey because it detected the word "dick," according to "Internet Filters, a public policy report," a 2001 study from the Brennan Center for Justice. Other software blocked the Declaration of Independence, Shakespeare's complete plays, "Moby-Dick," and "Marijuana: Facts for Teens," a brochure published by the National Institute on Drug Abuse.
The bigger problem, honestly, is that the filters then lead to complacency. Once the schools have filters in place, it gives officials a false sense that things are "safe." And yet, plenty of bad stuff gets through, while good stuff gets blocked. This isn't to say that filters have no use at all, but it's about time people learned not to rely on them so heavily.
Filed Under: censorship, filtering
Reader Comments
What ends up happening is that the filter blocks only the good stuff: people intentionally design the bad stuff to circumvent the filters, while the good stuff makes no effort to do so.
Re:
The bad stuff meanwhile has the freedom of being completely polymorphic.
White-listing things won't work because there's simply too much and it would overload the system.
Black-listing doesn't work either because you're back to the game of whack-a-mole.
The only way to win the game is to have a real, live, human monitoring the situation in the real world.
I think that by the time 5th or 6th grade rolls around, there isn't any information that should be taboo or restricted; restriction dangerously creates a distorted world view, exposes the child to real-world threats by willfully enforcing their ignorance, and doesn't actually stop them from accessing that same information elsewhere.
It's ineffective, harmful, and a waste of resources; just an outright silly idea.
Why Doesn't This Get Fixed?
Give each student, child, or staffer a unique "user code". Insert an "override" button. When someone requests content that would normally get blocked, they have the option to click the override button and enter their code. They could then go to the site, unfettered, in real time.
BUT, for every use of the override, a report is sent to the vice-principal, boss, or mom saying who clicked the button and what blocked site they visited. They would then have the option of adding that site to a whitelist or blacklist.
Then, the blocking software company would get a report of sites that it should reconsider. Systems like this, which use the power of the crowd, a crowd of human observers (not machines and rules), produce better, more human decisions. And it's easy to do using them thar interwebs.
Problem solved. Well, not solved, but a heck of a lot better than before.
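A minimal sketch of how such an override-and-audit flow might work, assuming a simple gateway sitting in front of the filter; the class and method names here (FilterGateway, review_report, and so on) are hypothetical, not taken from any real filtering product:

# Hypothetical sketch of the override-and-report idea above; names are
# illustrative only, not from an actual filtering vendor.
from datetime import datetime

class FilterGateway:
    def __init__(self, blocklist, user_codes):
        self.blocklist = set(blocklist)      # sites the filter normally blocks
        self.user_codes = dict(user_codes)   # user -> personal override code
        self.override_log = []               # audit trail for the reviewer

    def request(self, user, site, override_code=None):
        if site not in self.blocklist:
            return "allowed"
        if override_code is not None and self.user_codes.get(user) == override_code:
            # Let the user through in real time, but record the override so the
            # vice-principal, boss, or parent can review it later.
            self.override_log.append((datetime.now(), user, site))
            return "allowed (override logged)"
        return "blocked (override available)"

    def review_report(self):
        # The reviewer decides whether each site belongs on a whitelist or a
        # blacklist; the same list could be forwarded to the filter vendor.
        return list(self.override_log)

gateway = FilterGateway(blocklist={"example-blocked-site.test"},
                        user_codes={"student42": "s3cret"})
print(gateway.request("student42", "example-blocked-site.test", "s3cret"))
for timestamp, user, site in gateway.review_report():
    print(timestamp, user, site)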
http://en.wikipedia.org/wiki/Scunthorpe_problem
There are a lot of other examples.
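This is the classic failure mode of naive substring matching: a banned string embedded inside an innocent word or name triggers the block. A toy sketch of the mechanism, using examples from the article above rather than code from any actual filtering product:

# Toy demonstration of the Scunthorpe problem: a naive substring filter blocks
# innocent text whenever a banned string appears inside a longer word or name.
BANNED_SUBSTRINGS = {"dick", "bikini"}

def naive_filter(text):
    lowered = text.lower()
    return any(banned in lowered for banned in BANNED_SUBSTRINGS)

for phrase in ["Dick Armey", "Bikini Atoll", "Moby-Dick", "atomic testing"]:
    print(phrase, "->", "BLOCKED" if naive_filter(phrase) else "allowed")
# "Dick Armey", "Bikini Atoll", and "Moby-Dick" all get blocked;
# only "atomic testing" makes it through.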
Re:
(Ducking the blows raining down upon my trolling head). April Fool's.
Solution?
Even though "all the kids have iPhones with 3G," that still doesn't negate a school's responsibility. Just like in the days before the internet, parents are responsible for what happens away from school, and in the case of the iPhones, I would think that parents would be responsible for that. But just like I would be upset if my school subscribed to Playboy (the nudie magazine), I would be upset if my child was able to access Playboy.com, the porn website, from the school's computers.
All this being said, what is the real world solution?
Filtering does more good than harm
The flexibility given to filtering software COULD be limitless, but unfortunately it seems that the programmers are more focused on blocking as much as possible.
Does filtering stop everything? No. Does filtering block safe sites? Yes. Does filtering do its job regardless? I think so.
I used to work with an administrator at a school, doing a few contracts with him to install software, etc. While there, before any filters were up, if you looked into the computer class there were maybe 3 out of a class of 30 working on the assigned project. The other children would all be on facebook, myspace, youtube, free flash game sites, etc. After implementing the filter, although it wasn't foolproof, the number of children paying attention to their lessons jumped dramatically. Once in a while, he said, there were complaints that it was blocking too much, but that's a small price to pay when students' average grades go up.
Just a thought... Before the internet, people used something called an Encyclopedia to find information. Last I checked, they were still around! If you need to find information on something and you can't find it on the internet, rather than getting frustrated, open a book!
Re: Solution?
By making things forbidden by an authority, you tend to dramatically increase their appeal. Instead of attempting to shelter children from "bad things" en masse, teach children (and adults too) critical thinking skills. Teach them how to discern reliable information from unreliable nonsense. Teach them why it wouldn't be a good idea to spend all day looking at porn instead of doing their homework.
I see no problem in exposing children to new ideas or experiences as long as they have been given the mental tools to process those experiences in a healthy manner.
good enough
At work we block all social sites, streaming video and audio, gaming sites, and sites with questionable content. When you hit a blocked site, though, it automatically presents you with a simple form to fill out to have the site reviewed and added to a safe content list. Yeah, the guy who spends his time on gambling sites fills it out and claims it's work-related so he can maintain his gambling habit. So sad. NOW BACK TO REAL WORK!
Re: Re: Solution?
Yay, Streisand Effect!
> Teach them how to discern reliable information from unreliable nonsense.
*gasp!* Are you implying that... CHILDREN SHOULD BE ALLOWED TO THINK?!? Blasphemy!