How Internet Filtering Blocks All Sorts Of Legit Info

from the here-we-go-again dept

This has been pointed out before, but it's received less attention recently, despite rising interest in internet filters: the filters don't work very well. In fact, Mitch Wagner went looking and found example after example of internet filters in schools blocking all sorts of legit info:
The Canadian National History Society was forced to change the name of its magazine, The Beaver, founded in 1920, because the name of the magazine caused it to be blocked by Internet filters.

One teacher wanted to show students some pictures that would illustrate the effects of atomic testing. "However when I went to bring the wikipedia page up at school during class, it was blocked by our internet filter, BESS. The name of the islands? 'Bikini Atoll,'" said Doug Johnson, quoting the teacher. Johnson, a director of media and technology at a Minnesota school district, put out a call in July for stories about how Internet filtering hobbles education, and got an earful. ("Censorship by Omission")

Johnson also shares a message from another teacher, describing how a school's systems security manager decided to block the social bookmarking site delicious.com. The reason? You can use the site to search for porn....

The problem goes back years. A filter blocked the Web site of former House Majority Leader Richard Armey because it detected the word "dick," according to "Internet Filters, a public policy report," a 2001 study from the Brennan Center for Justice. Other software blocked the Declaration of Independence, Shakespeare's complete plays, "Moby-Dick," and "Marijuana: Facts for Teens," a brochure published by the National Institute on Drug Abuse.
Again, some of these stories appear to be old ones, which the filtering industry insists have been fixed, but these kinds of filters will always create false positives.
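A rough illustration of why the false positives are baked in; the banned-word list below is invented for this example, not taken from any real filter:

```python
# A toy keyword filter of the kind described above. The word list is
# invented for illustration; real products use much larger lists.
import re

BANNED_WORDS = {"dick", "beaver", "bikini", "marijuana"}

def blocked_by_substring(text: str) -> bool:
    # Naive approach: block if a banned word appears anywhere, even inside
    # another word.
    t = text.lower()
    return any(word in t for word in BANNED_WORDS)

def blocked_by_whole_word(text: str) -> bool:
    # Slightly smarter: match whole words only. This avoids some errors, but
    # still blocks legitimate pages that simply use the word.
    t = text.lower()
    return any(re.search(rf"\b{re.escape(w)}\b", t) for w in BANNED_WORDS)

for page_title in ("Nuclear testing at Bikini Atoll",
                   "House Majority Leader Richard 'Dick' Armey",
                   "The Beaver: Canada's History Magazine",
                   "Marijuana: Facts for Teens"):
    print(page_title, blocked_by_substring(page_title), blocked_by_whole_word(page_title))
# All four legitimate titles are blocked either way, because the "bad" words
# are also ordinary names and topics.
```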

The bigger problem, honestly, is that the filters then lead to complacency. Once the schools have filters in place, it gives officials a false sense that things are "safe." And yet, plenty of bad stuff gets through, while good stuff gets blocked. This isn't to say that filters have no use at all, but it's about time people learned not to rely on them so heavily.


Filed Under: censorship, filtering


Reader Comments




  1. Marcus Carab (profile), 31 Mar 2010 @ 6:31pm

    Don't half the kids have iPhones and 3G now anyway?


  2. Anonymous Coward, 31 Mar 2010 @ 7:03pm

    "And yet, plenty of bad stuff gets through, while good stuff gets blocked."

    What ends up happening is that the filter only blocks the good stuff because people intentionally design the bad stuff to circumvent the filters while the good stuff is not as strongly designed to do so.


  3. Anonymous Coward, 31 Mar 2010 @ 7:03pm

    is it april 1st already???


  4. Michael (profile), 31 Mar 2010 @ 7:37pm

    Re:

    It's even worse than that. The good stuff can't change the facts it deals with, such as the names of islands, people, and historic events, any of which could contain "improper" language. The same goes for factual descriptions of ethnic hatred, cleansing, and other barbaric acts committed by the persecuting sides.

    The bad stuff meanwhile has the freedom of being completely polymorphic.

    White-listing things won't work because there's simply too much and it would overload the system.
    Black-listing doesn't work either because you're back to the game of whack-a-mole.

    The only way to win the game is to have a real, live, human monitoring the situation in the real world.

    I think that by the time 5th or 6th grade rolls around, there isn't any information that should be taboo or restricted. Restriction creates a dangerously distorted world view, exposes the child to real-world threats by willfully enforcing their ignorance, and doesn't actually stop them from accessing the same information elsewhere.

    It's ineffective, harmful, and wasteful of resources; just outright a silly idea.


  5. Anonymous Coward, 31 Mar 2010 @ 8:02pm

    My middle school used BESS. It was a joke, and after a while pretty much every single student knew more than one way to bypass the filter. It frequently blocked sites for having "too many words" from some secret (to us) list of evil words, and the limit wasn't even relative to page size or length, just an arbitrary number.
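    A rough sketch of the distinction being pointed at here; the word list and thresholds are invented for the example, not BESS's actual rules:

    ```python
    # Toy comparison: a fixed hit count vs. a score relative to page length.
    FLAGGED = {"breast", "drug", "bomb"}  # illustrative list only
    ABSOLUTE_LIMIT = 5                    # "just an arbitrary number"
    RELATIVE_LIMIT = 0.01                 # flagged words per word of text

    def hits(text):
        return sum(1 for w in text.lower().split() if w.strip(".,") in FLAGGED)

    def blocked_absolute(text):
        return hits(text) > ABSOLUTE_LIMIT

    def blocked_relative(text):
        words = text.split()
        return bool(words) and hits(text) / len(words) > RELATIVE_LIMIT

    # A long medical article mentions "breast" a dozen times among thousands of words:
    article = ("breast cancer screening " * 12) + ("unrelated medical text " * 2000)
    print(blocked_absolute(article))  # True  -- tripped by the fixed limit
    print(blocked_relative(article))  # False -- rare relative to page length
    ```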


  6. Derek Kerton (profile), 31 Mar 2010 @ 8:27pm

    Why Doesn't This Get Fixed?

    This would be easy to fix if the filter makers had any sense of innovation:

    Give each student, child, or staffer a unique "user code". Insert an "override" button. When someone requests content that would normally get blocked, they have the option to click the override button and enter their code. They could then go to the site, unfettered, in real time.

    BUT, for every use of the override, a report is sent to the vice-principal, boss, or mom about who clicked the button and what blocked site they visited. They would then have the option of adding that site to a white list or black list.

    Then, the blocking software company would get a report of sites that it should reconsider. It's systems like this, drawing on a crowd of human observers (not machines and rules), that produce better, more human decisions. And it's easy to do using them thar interwebs.

    Problem solved. Well, not solved, but a heck of a lot better than before.
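    A minimal sketch of this flow, assuming invented names (FilterPolicy, request, audit_log) rather than any real filtering product's API:

    ```python
    # Hypothetical "override with audit trail" flow, as described above.
    # All names are invented for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class FilterPolicy:
        blocked_keywords: set = field(default_factory=lambda: {"poker", "casino"})
        whitelist: set = field(default_factory=set)
        blacklist: set = field(default_factory=set)
        audit_log: list = field(default_factory=list)  # reports for the vice-principal/boss/mom

        def is_blocked(self, url: str) -> bool:
            if url in self.whitelist:
                return False
            if url in self.blacklist:
                return True
            return any(word in url.lower() for word in self.blocked_keywords)

        def request(self, user_code: str, url: str, override: bool = False) -> bool:
            """Return True if the page should be served."""
            if not self.is_blocked(url):
                return True
            if override:
                # Serve the page in real time, but record who overrode what so a
                # human can later add the site to the white list or black list.
                self.audit_log.append({"user": user_code, "url": url})
                return True
            return False

    policy = FilterPolicy()
    print(policy.request("student-042", "http://example.org/poker-history"))                 # False: blocked
    print(policy.request("student-042", "http://example.org/poker-history", override=True))  # True: served and logged
    print(policy.audit_log)  # what gets reported for human review
    ```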


  7. Hephaestus (profile), 31 Mar 2010 @ 8:47pm

    Re: Why Doesn't This Get Fixed?

    What you are suggesting is the anti-Google: only the least relevant results will show up. Everyone is offended by something.


  8. Blatant Coward (profile), 31 Mar 2010 @ 9:18pm

    RE: Why Doesn't This Get Fixed?

    You are also looking at, for schools and federal institutions at least, adding a long-term support package or tech-support fee, making the software 'less economically viable' than other products. Engineers won't be buying this stuff; part-time bureaucrats will.


  9. Anonymous Coward, 31 Mar 2010 @ 9:42pm

    This is, interestingly, known as the "Scunthorpe problem." One of the earlier occurrences was with the town of Scunthorpe, which contains the substring "cunt."

    http://en.wikipedia.org/wiki/Scunthorpe_problem

    There are a lot of other examples.


  10. Anonymous Coward, 31 Mar 2010 @ 10:16pm

    Re:

    Maybe they should have thought about that before they named the town after a woman's vagina.

    (Ducking the blows raining down upon my trolling head). April Fool's.


  11. Matthew Cruse (profile), 31 Mar 2010 @ 10:39pm

    Solution?

    So what is the real-world solution? It is impractical for most school districts to have a full-time IT person at every school, which is what would be required to have a real-time "human" filtering process in place. It is impractical for a teacher to watch everything 30 students do while in a computer lab, or to vet every search or page access/red flag. However I, and many others, feel that there should be some kind of gatekeeper/watchdog at the access point between our children and the wider world outside of school and home. In the old days this person was the school librarian (or the school board, in the case of textbooks). He/she ordered the books and periodicals for the school and vetted all of the incoming material. They also helped restrict access to age-appropriate material, i.e., 2nd graders not having access to the Human Sexuality texts, but 7th/8th graders did.
    Even though "all the kids have iPhones with 3G," that still doesn't negate a school's responsibility. Just like in the days before the internet, parents are responsible for what happens away from school, and in the case of the iPhones, I would think parents would be responsible for that. But just as I would be upset if my school subscribed to Playboy (the nudie magazine), I would be upset if my child was able to access Playboy.com, the porn website, from the school's computers.
    All this being said, what is the real-world solution?


  12. Chad, 31 Mar 2010 @ 11:10pm

    Filtering does more good than harm

    Reading the comment above about overriding, it just makes sense that software should be able to do that. Give administrators the ability to override the filter and to white-list domains and certain keywords on demand... even if it was only unlocked for a certain amount of time.

    The flexibility given to filtering software COULD be limitless, but unfortunately it seems that the programmers are more focused on blocking as much as possible.
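    A hedged sketch of what "unlocked for a certain amount of time" could look like; the class and method names are invented for illustration, not taken from any shipping filter:

    ```python
    # Hypothetical time-limited whitelist entries, as suggested above.
    import time

    class TemporaryWhitelist:
        def __init__(self):
            self._entries = {}  # domain -> expiry timestamp (None = permanent)

        def allow(self, domain, duration_seconds=None):
            expiry = None if duration_seconds is None else time.time() + duration_seconds
            self._entries[domain] = expiry

        def is_allowed(self, domain):
            if domain not in self._entries:
                return False
            expiry = self._entries[domain]
            return expiry is None or time.time() < expiry

    wl = TemporaryWhitelist()
    wl.allow("en.wikipedia.org")                     # permanent
    wl.allow("example-research-site.org", 45 * 60)   # unlocked for one class period
    print(wl.is_allowed("en.wikipedia.org"))         # True
    print(wl.is_allowed("facebook.com"))             # False
    ```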

    Does filtering stop everything? No. Does filtering block safe sites? Yes. Does filtering do its job regardless? I think so.

    I used to work with an administrator at a school, doing a few contracts with him to install software, etc. While there, before any filters were up, if you looked into the computer class there were maybe 3 out of a class of 30 working on the assigned project. The other children would all be on Facebook, MySpace, YouTube, free flash game sites, etc. After implementing the filter, although it wasn't foolproof, the number of children paying attention to their lessons jumped exponentially. Every so often he said there were complaints that it was blocking too much, but that's a small price to pay when the average grade of students goes up.

    Just a thought... Before the internet, people used something called an Encyclopedia to find information. Last I checked, they were still around! If you need to find information on something and you can't find it on the internet, rather than getting frustrated, open a book!


  13. Josh in CharlotteNC (profile), 1 Apr 2010 @ 6:36am

    Re: Solution?

    "All this being said, what is the real world solution?"

    By making things forbidden by an authority, you tend to dramatically increase their appeal. Instead of attempting to shelter children from "bad things" en masse, teach children (and adults too) critical thinking skills. Teach them how to discern reliable information from unreliable nonsense. Teach them why it wouldn't be a good idea to spend all day looking at porn instead of doing their homework.

    I see no problem in exposing children to new ideas or experiences as long as they have been given the mental tools to process those experiences in a healthy manner.


  14. Jimr (profile), 1 Apr 2010 @ 6:50am

    good enough

    Filters have their place. They are not 100% foolproof - nothing is.

    At work we block all social sites, streaming video and audio, gaming sites, and sites with questionable content. When you hit a blocked site, though, it automatically presents you with a simple form to fill out to have the site reviewed and added to a safe-content list. Yeah, the guy that spends his time on gambling sites fills it out and claims it is work related to maintain his gambling habits - so sad. NOW BACK TO REAL WORK!


  15. Wesha (profile), 1 Apr 2010 @ 6:53am

    Re: Re: Solution?

    > By making things forbidden by an authority, you tend to dramatically increase its appeal

    Yay Streisand Effect!


  16. Wesha (profile), 1 Apr 2010 @ 6:55am

    Re: Re: Solution?

    > By making things forbidden by an authority, you tend to dramatically increase its appeal

    Yay Streisand Effect!

    > Teach them how to discern reliable information from unreliable nonsense.


    *gasp!* Are you implying that... CHILDREN SHOULD BE ALLOWED TO THINK?!? Blasphemy!


  17. Another User, 1 Apr 2010 @ 7:47am

    Re: Why Doesn't This Get Fixed?

    I work at a school and we have a system very similar to this called Websense. The filter is designed to work with Active Directory and has the option on a lot of sites to use quota time. Most of the time this filter does an excellent job of blocking the bad content and letting through the good content. Also, there is a process that allows the student to request a site to be unblocked if they believe it will help them with their school work.


  18. Derek Kerton (profile), 1 Apr 2010 @ 11:43pm

    Re: Re: Why Doesn't This Get Fixed?

    Yeah, but blocking content that offends just a few is the status quo. And using overly general keywords creates false positives. Too much is blocked. I'm suggesting a process to allow someone to view the content EVEN when it is blocked.


  19. Donald Lindsey, 17 Aug 2010 @ 6:40pm

    The Filter Fallacy

    I agree with the author: you can't rely on a filter to protect yourself and your family. However, you can purchase an inexpensive subscription for a filter which includes age-based sensitivity levels and the option for a Filter Guardian (a teacher, parent, or mentor who can override illegitimately blocked sites). The Covenant Eyes Dynamic Filtering System is by far the most intelligent and innovative filter on the market. Take a look via my website to take advantage of a free 30-day trial: www.cultureofaccountability.org


  20. Anonymous Coward, 22 Sep 2010 @ 8:55am

    The whole idea of security filters in schools is ridiculous. If a teacher is always present, the students should not be able to get away with looking up inappropriate content. If they can control students in a classroom, they should stop being lazy and watch kids on the computer.



