Contrary To The Claims Of Grandstanding Politicians, Child Porn Is Very Difficult To Stumble Onto Accidentally

from the so-omnipresent-hardly-anyone-ever-sees-it dept

Google has decided to be even more "proactive" in fighting child pornography, crafting a database of flagged images that will be made available to law enforcement, investigators and even its own competitors. Somehow the company plans to make it searchable while simultaneously deleting the offending items from the web.

There's no reason for Google to be doing this other than as a response to the UK government's consensus that Google = Internet, and is therefore responsible for policing everything it crawls. Unfortunately, many offending images will remain beyond the reach of Google. Additionally, turning a hunt for child porn into an algorithmic search will lead to false positives and deletions, as anyone familiar with ContentID and YouTube can readily attest.

The politicians crusading for a child porn-free internet will be satiated. Google's new offensive plays to their strengths, namely:

1. Proclaiming something must be done.
2. Allowing someone else to do that "something."

The UK's current porn-blocking efforts (of regular, legal porn) are a comedy of errors. "Child safety" filtering on mobile networks has already resulted in the mistaken blocking of YouTube, Orange, and The Jargon File. With these filters becoming mandatory next year, more and more sites will find themselves cut off from their users due to the general ineptness of blocking software crafted at the behest of hand-wringing bureaucrats.

Child porn, however, remains the true enemy, especially in Britain, where its profile is heightened due to recent events. In the oft-echoed call for someone (namely, Google) to do something about child porn, a rather startling statistic was quoted. According to the Internet Watch Foundation (IWF -- an industry-funded group that compiles lists of keywords and illegal abuse sites for subsequent banning by Google, et al), "more than 1.5 million internet users in the UK mistakenly viewed child abuse images last year." (Only 40,000 were reported to the IWF, a point which is left open to speculation.)

It's a rather alarming number. But is it accurate? UK website Ministry of Truth went digging into the math behind this "statistic." The "1.5 million" quote above was pulled from an IWF press release that offered no citations. Perusing the IWF's site itself, MoT found another press release that applied a bit of hedging to the claim.
New study reveals child sexual abuse content as top online concern and potentially 1.5m adults have stumbled upon it.
Note the one word that changes everything.
Hang on a second, we’ve just gone from “1.5 million adults have stumbled across” child porn to “potentially 1.5 million adults have stumbled upon it”, which rather starts to suggest that the IWF’s “study” might not be quite what they’re making it out to be and, sure enough, a little further down the page we hit paydirt:

The ComRes poll conducted among a representative sample of 2058 British adults for the Internet Watch Foundation (IWF) shows the vast majority of people in Britain think that child sexual abuse content (“child pornography”) (91%) and computer generated images or cartoons of child sexual abuse (85%) should be removed from the internet.

Riiiiight… so it’s not actually a study, it’s an opinion poll; a grade of evidence that generally sits just above the story you heard from a bloke down the pub who swears blind that his cousin’s boyfriend knows a bloke who knows the bloke that it actually happened to.

Long story short (although the long story is a very interesting read), the poll used skewed demographics (weighted heavily towards the 55-and-older set) to produce this meaningless percentage:
"- 3% have seen/encountered 'Child pornography'"

According to the 2011 Census the adult population of Great Britain is just over 48.1 million and 3% of that is a little under 1.43 million people, which the IWF has rounded up to 1.5 million (ignoring the usual rules on rounding) for its press releases.
The problems with accepting this at face value (and then attaching it to multiple press releases) are numerous. For starters, as many as 1 in 7 UK citizens have never used a computer, much less have internet access. For another, one person's "child porn" is another person's "adult film starring consenting, paid adults." One need look no further than the Daily Mail's disastrous attempt to show how easy it was to find child porn simply by using the same search terms as those found on a convicted child killer's internet history.
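The rounding complaint is easy to verify. A quick sketch, using only the figures quoted above (the adult population and the 3% poll result):

```python
# Reproducing the Ministry of Truth arithmetic. Figures are the ones
# quoted in the article: "just over 48.1 million" adults, of whom 3%
# reported seeing/encountering "child pornography" in the ComRes poll.
adult_population = 48_100_000
share_seen = 0.03

estimate = adult_population * share_seen
print(f"{estimate:,.0f}")        # 1,443,000
print(f"{estimate / 1e6:.1f}m")  # 1.4m under normal rounding, not 1.5m
```

Which is to say the IWF's "1.5 million" headline figure required rounding 1.443 million up, against the usual convention.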

The Mail's Amanda Platell claimed to have taken a journey to the "hell known as internet child porn." Unfortunately, her only souvenir from the trip was a misidentified clip from a 13-year-old (adult) porn film. True, the content of the film would be repulsive to many (simulated sexual assault), but the film was made and distributed legally.

Not only is the number of "potential" child porn viewers lower than the IWF claims, but the number of readily accessible pages containing child porn images on the internet is more "rounding error" than panic-worthy.

Here are the numbers the IWF came up with in its 2012 report.
In total, the IWF found 9,550 web pages that hosted child sexual abuse content spread across 1,561 internet domains in 38 different countries. 60% of the child sexual abuse content identified by the IWF was found on ‘one click hosting websites’, i.e. file hosting services/cyberlockers which, for reasons known only to itself, the IWF insists on referring to as ‘web lockers’ despite the fact that no one else seems to use that particular phrase.
A brief glance at that total should readily tell you the percentage is insignificant. And this is a number compiled by a group tasked with hunting down child pornography, an entity that would have a much higher hit rate than the average person browsing the web. Here's how it stacks up to the whole of the internet.
Out of an estimated 14.8 billion indexed web pages, the British public reported just 9,696 web pages (0.000065%) containing child pornography to the IWF in the whole of 2012.

In that same year, just 1561 internet domains (0.001%) were reported to the IWF that were found to contain child pornography out of a minimum of 145.5 million registered domains (and that’s just for five gTLDs and one country specific domain).

In fact, on a single ordinary day in May 2013, 92 times as many new domains were registered across just the six TLDs we have figures for, than were reported and found to be hosting child porn by members of the UK general public in the whole of 2012.
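Those proportions check out. A quick sketch using the figures as given (the 14.8 billion indexed-page estimate is the Ministry of Truth's, not an independently verified count):

```python
# Checking the Ministry of Truth percentages quoted above.
reported_pages = 9_696            # pages reported to the IWF in 2012
indexed_pages = 14_800_000_000    # estimated indexed web pages
reported_domains = 1_561          # domains found hosting such content
registered_domains = 145_500_000  # minimum registered domains (6 TLDs)

print(f"{reported_pages / indexed_pages:.7%}")      # 0.0000655%
print(f"{reported_domains / registered_domains:.3%}")  # 0.001%
```

Both come out as vanishingly small fractions of the web, matching the quoted 0.000065% and 0.001%.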
How hard would it be to access child porn if you weren't looking for it specifically? The Ministry of Truth puts your odds at 1 in 2.6 million searches. (MoT points out the odds will fluctuate depending on search terms used, but for the most part, it's not the sort of thing someone unwittingly stumbles upon.)

All those demanding Google do more to block child porn fail to realize there's not much more it can do. The UK already has an underlying blocking system filtering out illegal images at the ISP level, and Google itself runs its own blocker as well.

The above calculations should put the child porn "epidemic" in perspective. As far as the web that Google actively "controls," it's doing about as much as it can to keep child porn and internet users separated. There are millions of pages Google can't or doesn't index, and those actively looking for this material will still be able to find it. Google (and most other "internet companies") can't really do more than they're already doing. But every time a high-profile, child pornography-related crime hits the courtroom (either in the UK or the US), the politicians instantly begin pointing fingers at ISPs and search engines, claiming they're not doing "enough" to clean up the internet, something that explicitly isn't in their job description. And yet, they do more in an attempt to satiate the ignorant hunger of opportunistic legislators.

If Google is "the face of the internet," as so many finger-pointers claim, then the "internet" it "patrols" is well over 99% free of illegal images, according to a respected watchdog group. But accepting that fact means appearing unwilling to "do something," an unacceptable option for most politicians.

Filed Under: child porn, grandstanding, politicians


Reader Comments



  • icon
    Ninja (profile), 20 Jun 2013 @ 7:54am

    Even if you do look for child porn actively you'll find it very hard to find. Most likely you'll find pics of adult actresses that look very young. There's this Chinese mangaka that's in her 30s who looks like she is 14. Besides it'd take an incredibly dumb person to host such material with open access. I'd guess child porn is kept in private servers very heavily protected.

    That's not how you deal with child porn. You get the police to do their job: infiltrate the networks where it's distributed and make all hell break loose.

    But something must be done. Except that we want to make the least effort possible and we are not really worried if it will be effective, eh?

    link to this | view in chronology ]

    • identicon
      gnudist, 20 Jun 2013 @ 1:29pm

      Re:

      You seem disturbingly expert on the subject

      link to this | view in chronology ]

      • identicon
        Anonymous Coward, 20 Jun 2013 @ 1:49pm

        Re: You seem disturbingly expert on the subject

        Indeed it is fear of public notoriety that mainly prevents people from having a rational discussion about these issues. Peer pressure and shame puts a very strong chilling effect on speech that questions the mainstream assumptions.

        link to this | view in chronology ]

      • icon
        PaulT (profile), 21 Jun 2013 @ 12:50am

        Re: Re:

        "I'm familiar with a single porn actress who's very young looking" = "disturbingly expert"?

        I think we can guess why so many opinions stated here by ACs are so obviously wrong, if passing familiarity with a subject is suspicious...

        link to this | view in chronology ]

    • identicon
      Rekrul, 20 Jun 2013 @ 1:55pm

      Re:

      Even if you do look for child porn actively you'll find it very hard to find.

      Don't forget that many countries have criminalized drawings of underage sex as well as photos. So even manga is no longer safe.

      link to this | view in chronology ]

      • icon
        Not an Electronic Rodent (profile), 22 Jun 2013 @ 1:23pm

        Re: Re:

        Don't forget that many countries have criminalized drawings of underage sex as well as photos.
        Yeah... talk about begging for false positives. How the f*ck do you tell how old a drawing is???

        link to this | view in chronology ]

        • icon
          PaulT (profile), 23 Jun 2013 @ 1:19am

          Re: Re: Re:

          More to the point, who cares? Age of consent and child sex laws are there to prevent abuse and exploitation of minors. A drawing cannot be abused or exploited. So who the hell cares how old it is, even if it represents something that would be disturbing in real life?

          link to this | view in chronology ]

          • icon
            nasch (profile), 24 Jun 2013 @ 7:21am

            Re: Re: Re: Re:

            Perhaps we should be encouraging child porn drawings. Maybe if they were more available and with fewer consequences there would be less demand for the real thing, and therefore less supply (in other words fewer children abused).

            link to this | view in chronology ]

        • identicon
          Rekrul, 24 Jun 2013 @ 4:13pm

          Re: Re: Re:

          Not to mention that some countries in Europe have lower age of consent laws than the US. I think the AoC in Britain is 16. So images and videos that would be legal there are illegal here. So even visiting a porn site that seems to be legal can get you in trouble.

          link to this | view in chronology ]

          • icon
            nasch (profile), 24 Jun 2013 @ 8:59pm

            Re: Re: Re: Re:

            I think the AoC in Britain is 16.

            Pretty sure it used to be 16 but went up, so stuff that used to be legal now isn't.

            link to this | view in chronology ]

            • icon
              PaulT (profile), 25 Jun 2013 @ 1:43am

              Re: Re: Re: Re: Re:

              It's complicated. I believe the situation is currently that the overall age of consent is 16, but that it's 18 when relating to either sex with older people "in a position of trust" (e.g. teachers) or employment for professional work (porn, topless modelling, etc). In other European countries, it can be as young as 14.

              link to this | view in chronology ]

            • icon
              Not an Electronic Rodent (profile), 25 Jun 2013 @ 10:08am

              Re: Re: Re: Re: Re:

              Pretty sure it used to be 16 but went up, so stuff that used to be legal now isn't.
              It's 16, but you can't look at or appear in "adult images" until 18... so looking at real naked 16-year olds is apparently fine but not a picture of one... now that makes sense...
              The argument is I believe that "looking at drawings might lead to a thirst for more 'real' things". No-one ever seems to consider or even acknowledge the flip-side to that coin where drawings might assuage the urge and prevent something actually bad...

              link to this | view in chronology ]

              • icon
                nasch (profile), 25 Jun 2013 @ 11:15am

                Re: Re: Re: Re: Re: Re:

                The argument is I believe that "looking at drawings might lead to a thirst for more 'real' things".

                Which are actually legal to do. Makes perfect sense. *facepalm* I think a more compelling (less nonsensical) argument would be that it's OK for a 16 year old to have sex with an adult, but that society (and the individuals) is better off if 16 year olds aren't working in porn or prostitution. That is, it's not worse to look at a picture of a naked 16 year old than it is to have sex with one, but it is worse to pay a 16 year old to take his or her clothes off than to do it for non-monetary reasons. That sort of sidesteps the issue of naked photos for fun, though, whether selfies or taken by a boyfriend or what have you.

                link to this | view in chronology ]

  • icon
    Zakida Paul (profile), 20 Jun 2013 @ 8:41am

    I love all these calls for Google to do more to stop child porn.

    Contrary to popular belief, paedophiles are not getting this material from Google. They get it from the darknet via peer to peer sharing networks; and they use techniques to hide what they are doing. Anything Google (or even ISPs, for that matter) do will be totally ineffectual.

    The problem of child porn will not be solved with a technological solution because it is a social problem and, as such, requires a social solution.

    link to this | view in chronology ]

    • identicon
      Anonymous, 20 Jun 2013 @ 4:23pm

      Re:

      I'm not sure how some guy accessing and looking at pictures in the privacy of his own home is a social problem.

      link to this | view in chronology ]

      • icon
        PaulT (profile), 21 Jun 2013 @ 12:54am

        Re: Re:

        The problem is that in order for child porn to be produced, a child has to be abused. So, even if he's just looking at a collection from home, child abuse and possibly rape has taken place somewhere in order for him to do so. That's a social problem, no matter how you spin it - which is why we need to go after those who produce it, not Google or people who stumble across it.

        link to this | view in chronology ]

  • identicon
    Anonymous Coward, 20 Jun 2013 @ 8:43am

    Sometimes, people PUT child porn in front of you, especially if someone has a grudge. It has happened to me. The standard definition of 'child porn' is porn involving children. They seem to think it's something different. You can easily run across things they want to ban but contain no children.

    link to this | view in chronology ]

  • identicon
    Anonymous Coward, 20 Jun 2013 @ 8:45am

    You know... Because all pedophiles only use Google...

    link to this | view in chronology ]

  • identicon
    Anonymous Coward, 20 Jun 2013 @ 8:51am

    Can someone explain on what grounds Google is being made the net's policeman at its own expense? Presumably Google is also liable for prosecution if it fails in this task, like failing to find any paedophiles and their pictures.

    link to this | view in chronology ]

    • icon
      Violated (profile), 20 Jun 2013 @ 12:12pm

      Re:

      Google has never been liable for links. They just like to make out they are for some strange reason.

      Now they are the willing masochist that everyone beats on.

      link to this | view in chronology ]

  • icon
    ChurchHatesTucker (profile), 20 Jun 2013 @ 8:52am

    Can we get the NSA to turn over the browsing history of these politicians? I'd love to see how they "stumble" across this stuff.

    link to this | view in chronology ]

  • identicon
    Michael, 20 Jun 2013 @ 9:25am

    I'm not so sure...

    It seems to me that any time I hear about someone finding child porn on someone's computer they immediately say: "That just 'popped' up!", "That was an accident!", or "It wasn't me, it must have been a popup that downloaded the 451 movies!"

    Yeah. A bit like your wife finding a pile of singles and an ATM receipt from a gentleman's club balled up in your pocket, and suddenly that kind of thing apparently appears in pockets all the time.

    link to this | view in chronology ]

    • icon
      Ninja (profile), 20 Jun 2013 @ 9:31am

      Re: I'm not so sure...

      Well, it worked well when mom found my porn stash. I told her those pics were cached from not so nice advertisement in sites*. Apparently it works with law enforcement then?

      * I think she pretended to be fooled and I pretended it was a good excuse but that's another story.

      EVIL CACHES, SUE MOZILLA FOR ALL THE CHILD PORN.

      link to this | view in chronology ]

      • identicon
        Anonymous Coward, 20 Jun 2013 @ 10:38am

        Re: Re: I'm not so sure...

        Yeah, no one wants to hear about your mommy porn issues, thanks.

        link to this | view in chronology ]

    • icon
      Zakida Paul (profile), 20 Jun 2013 @ 9:37am

      Re: I'm not so sure...

      Or the classic "it is for an article I am researching".

      link to this | view in chronology ]

      • icon
        Sheogorath (profile), 20 Jun 2013 @ 6:37pm

        Re: Re: I'm not so sure...

        Or the classic "it is for an article I am researching".
        That was Pete Townshend's excuse, and it didn't work out so well for him, either.

        link to this | view in chronology ]

  • identicon
    Sean, 20 Jun 2013 @ 9:28am

    Easy as . . .

    I have never come across child porn in my life. Of course, I do not even look at regular adult porn, so that makes it a little more difficult, and there has only been one time that I accidentally came across any type of porn: that was when a radio station's webmaster forgot to renew their domain. It got registered by a porn producer.

    link to this | view in chronology ]

  • identicon
    Anonymous Coward, 20 Jun 2013 @ 10:22am

    Child porn may be hard to find but Pedobear is everywhere.

    link to this | view in chronology ]

  • identicon
    Anonymous Coward, 20 Jun 2013 @ 10:48am

    Sounds more like an excuse to extract more of your rights whilst making you pay for the privilege.

    link to this | view in chronology ]

  • identicon
    Anonymous Coward, 20 Jun 2013 @ 10:50am

    whenever there is something on the 'net that someone doesn't like, they always make a big play of Google. the thick fuckers in the UK, who think it is so easy to stop this stuff and that there will be no harm done except to the parts of the 'net that deserve it, are about as informed as my big toe is on mars exploration. if it were so easy, then things could perhaps be achieved. however, the entertainment industries have been saying for years how easy it is to identify and block infringing material. we have seen how easy it is. they couldn't even identify their own stuff, let alone block what should have been. anyone with a mouth as big as Perry's can lay blame at the door of whoever. when in a 'discussion' (and i use the term loosely) with someone on TV, she won the whole thing hands down. the thing was, she just wouldn't let the other side say anything! anyone can win a debate if the other side is never allowed to say anything! her biggest problem, however, is not the mouth that just won't keep closed, it's the stupidity of her whole idea. all that is going to happen is that the issue will be forced underground, making it all but impossible for the police to catch those responsible! that, however, doesn't matter to people like her! being in the limelight and having her name associated with an 'unsavory internet process' is what matters!

    link to this | view in chronology ]

  • identicon
    mik, 20 Jun 2013 @ 11:06am

    Really, is it so hard?

    1. Child pornography: if you want to hit it "accidentally," just use Kazaa Lite to download some games or movies and you may hit it; or use eMule, where you may just as easily hit a cop honeypot.
    2. Computer generated images or cartoons of child sexual abuse (doesn't take a second to find the stuff) are even easier: type a lolicon search into Google and let its corrections lead you to the pics.
    3. Use a TGP: repeatedly press pics that open onto other legal TGPs and you may soon be in illegal waters, no looking needed.
    4. Find a Japanese ranking list and you have access to all kinds of stuff.
    5. Look for tinami.com and surf over its links.

    + getchu
    http://www.google.com/search?client=opera&q=getchu&sourceid=opera&ie=utf-8&oe=utf-8&channel=suggest
    dlsite:
    Cosplay (105)
    Fantasy (799)
    Heartwarming (1004)
    Hilarious (1018)
    Magical Girl (203)
    Moe (332)
    Nekomimi (Catgirl) (176)
    Puni (195)
    Robot (124)
    School (461)
    SF (256)
    Uniform (571)
    Yaoi (189)
    Yuri/Girls Love (196)

    And so on and on. If you're stupid you simply look for the stuff; if you have brains you have a VPN.

    link to this | view in chronology ]

  • icon
    Violated (profile), 20 Jun 2013 @ 11:33am

    Let us be quite clear: our fascist UK Government is only using child porn (and other lawful but distasteful porn) in order to block access to standard adult porn.

    Not to overlook that just about every internet-using UK citizen totally hates them for doing this.

    Had they really wanted to attack child porn, they are over a decade too late to join that party. The internet population chose to self-censor and, with the aid of law enforcement, the web was cleaned up. So let us be clear about what they find now: namely, one questionable image on an otherwise adult site, where even that image is simply a lawful-age model who looks younger than her true age.

    So the home of CP now are the dark nets like Tor

    link to this | view in chronology ]

    • icon
      Violated (profile), 20 Jun 2013 @ 11:56am

      Re:

      Damn a cut off malfunction. No edit so to continue...

      So the home of CP now is the dark nets like Tor Core, but even there you won't find CP unless you go through many sites and links in an active hunt for it.

      I have seen a UK adult site filter in action before on a mobile SIM, but I gave up all hope on it when many adult sites were not blocked because they were unknown to it. Then many innocent sites did get blocked, like the ASCII archive. Blocking ASCII art, really?

      So this mandatory policy only makes the situation worse by fooling parents into a false sense of security.

      To top this off, it is well known that easy porn access has led to a large drop in sexual crimes. So "saving" us from standard porn would mean more people getting molested and raped.

      And all this in a country where national newspapers, read by all ages, print naked women, proving to all what topless women look like.

      link to this | view in chronology ]

      • identicon
        tad, 23 Jul 2013 @ 6:37pm

        Re: Re:

        "To top this off then it is well known that easy porn access has led to a large drop in sexual crimes. So to save us from standard porn would mean more people getting molested and raped."

        This is an interesting side note.. I'd love to know of any evidence to support this?

        link to this | view in chronology ]

  • identicon
    Anonymous Coward, 20 Jun 2013 @ 11:58am

    On the porn tubes you can find the occasional pre-18 Traci Lords clip.

    link to this | view in chronology ]

  • identicon
    Anonymous Coward, 20 Jun 2013 @ 1:13pm

    What is child porn?

    A point mentioned but not explored is: what is defined as CP? Ask 10 different people and you'll get 10 different answers. Simple nudity should not be conflated with hardcore porn, yet I'm sure it regularly is. In some US states a wet t-shirt can define an image as CP, but in most it does not. The age of the "victim" makes a big difference too. A 17 yo posing topless is in a completely different category than a 12 yo being forced to perform penetrative sex acts, and there is a whole range therein. To conflate the two is beyond ridiculous. And drawings and cartoons? How are you even going to begin to logically classify those?

    I will say that, if you search for porn a lot, it is not hard to stumble across an image of child porn, especially if we are talking softcore "porn". Although I have not tried (for obvious reasons -how many people are really willing to test the truth of the alarmist's assertions?), my sense is it is rather difficult to find large quantities of "quality" CP.

    A very great danger is the use of CP images to blackmail/shake down people, including people in positions of power. It is all too easy for a few images of softcore borderline porn to get mixed in with legal material. It's far too easy to plant this kind of stuff on someone's computer. That is why possession of CP should be decriminalized. Not legalized; it would still be contraband, and possession would be no more than a violation. The distinction between hardcore and softcore should be taken into account as well. Distribution, and obviously production, would still be crimes of graduated severity.

    link to this | view in chronology ]

  • identicon
    Anonymous Coward, 20 Jun 2013 @ 3:04pm

    false positves not a problem

    The recent media reports of Google's pro-active "effort to eradicate child abuse imagery online" are all based on a blog entry that was clearly written by someone at Google who is non-technical.
    http://googleblog.blogspot.com/2013/06/our-continued-commitment-to-combating.html
    It is hard to tell from this what exactly Google has done on its own in this effort apart from providing money, software and hardware to other groups trying to identify and maybe filter out child porn from the internet.

    A few basics. Google uses a database of hashes to identify copies of known child porn images. It is not clear that Google itself has added to the databases that have been created through law enforcement efforts. The hashes have traditionally used MD5, which has cryptographic weaknesses related to someone designing a file that, when hashed, will match a target hash value. This is still very hard to do, so the problem of false positives is pretty much nonexistent. The issue may become important if there is an attempt to use hash values of encrypted files in court to prove possession of child porn. I must emphasize that with a 16-byte (128-bit) hash, collisions are exceedingly unlikely. Unlike the algorithms used for ContentID, there will not be a problem with false positives.
    The Google blog says:
    "Recently, we’ve started working to incorporate encrypted “fingerprints” of child sexual abuse images into a cross-industry database."
    I believe that should be corrected to say that they are incorporating fingerprints of encrypted child sexual abuse images into a cross-industry database. That is, they are trying to identify encrypted child porn by hashing those files as well. This is potentially useful but assumes that pedophiles don't re-encrypt images for storage or further distribution.
    It is not clear, from the blog, whether Google actually filters out search results for cp files, or web pages containing such files. Nor is it clear that Google uses any other methods to identify such files (e.g. searching gmail accounts for matches).
    It is fairly easy to defeat identification through a hash database by altering the image file in any minimal way. Apparently, law enforcement has had pretty good success in identifying known porn images through this method, so the usual conventional wisdom that most criminals are stupid seems to be true. My impression of Google's blog is that it is a feel-good PR piece that makes it seem like Google is doing a lot when it really isn't. Not that it should have to. Law enforcement should welcome the status quo, as it can identify and track pedophiles because they are using the internet to exchange files that are not uniquely encrypted. If such exchanges become impossible due to filtering from companies such as Google, then exchanges will be pushed further underground with the use of cryptography, so that even law enforcement will have a hard time identifying and tracking pedophiles.
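The commenter's point about minimal alterations can be illustrated with a toy sketch. This is not Google's actual system, just a minimal exact-match digest database assumed for demonstration: a cryptographic hash like MD5 matches only byte-identical files, so flipping a single bit defeats the lookup.

```python
import hashlib

# Toy hash-database matching as described in the comment above.
# Known files are indexed by MD5 digest; a lookup succeeds only
# when the candidate file is byte-identical to an indexed one.
known_hashes = set()

def index_file(data: bytes) -> None:
    known_hashes.add(hashlib.md5(data).hexdigest())

def is_known(data: bytes) -> bool:
    return hashlib.md5(data).hexdigest() in known_hashes

original = b"example image bytes"
index_file(original)

altered = bytearray(original)
altered[0] ^= 0x01  # flip a single bit

print(is_known(original))        # True
print(is_known(bytes(altered)))  # False: any change defeats the match
```

This fragility is why perceptual-hashing schemes such as Microsoft's PhotoDNA exist, designed to keep matching images that have been resized or slightly altered.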

    link to this | view in chronology ]

  • identicon
    Anonymous Coward, 20 Jun 2013 @ 3:21pm

    than the "internet" [google] "patrols" is well over 99% free of illegal images

    So, how high is the crime rate of this "britain" the uk political corpus patrols?

    link to this | view in chronology ]

    • identicon
      Anonymous Coward, 20 Jun 2013 @ 4:14pm

      Re:

      Ve vill not be zatisfied until ve reach 100 perzent! Heil de New Verld Order!

      link to this | view in chronology ]

  • icon
    John85851 (profile), 20 Jun 2013 @ 3:49pm

    Because blaming Google is easy

    Obviously it's easier for politicians to deal with this issue by telling Google to simply take down the images (if they even could) rather than go after the websites hosting the images (which is a crime) and the people producing the images (which is an even bigger crime). I would think politicians would be thanking Google for acting as a billboard directing law-enforcement directly to the criminal sites.

    But, nope: as usual, they think it's better to put a band-aid on the issue to cover it up rather than actually dealing with it, because dealing with it is hard. Plus, tracking down the creators and owners of the websites takes time, and the arrests could occur when the next DA takes office, so it'll count on his record.

    And how many of these websites are located in Russia, China, or some other country where the UK politicians can't easily arrest someone? It's much easier to go after a large US corporation which can be "persuaded" to cooperate under threat of not being able to do business in the country.

    • identicon
      Anonymous Coward, 21 Jun 2013 @ 1:55am

      Re: Because blaming Google is easy

      Obviously it's easier for politicians to deal with this issue by telling Google to simply take down the images

      When you are in a position to be able to order someone to do something, without having to tell them how, all problems are easy to solve. When asked how, you can simply say that is up to the questioner to find the solution.

      Failure to solve the problem is not your problem, but rather a failure of people to do what you told them to do.
      Further, people will report that they have done as requested when they have passed the problem down the food chain.

      This is how large bureaucracies end up lying to themselves.

  • identicon
    Anonymous Coward, 21 Jun 2013 @ 2:00am

    There is a simple reason that it's impossible to get rid of Child Porn completely. AFAIK, the legal definition is porn of a minor. That is to say, somebody under the age of consent. The age of consent is different in different countries. Ergo, a picture could be legal porn in one country, but illegal child porn in another.

  • icon
    th (profile), 23 Jun 2013 @ 8:25pm

    I would guess that stat is about right

    It's way, way too easy to click on a link in, say, Tumblr and come across an image that makes you back out as quickly as possible, as if you'd stepped on a hot coal. I'm not sure if they're actually classifiable as kiddie porn, but that seems to be the effect they're going for, and anyway I don't stay and ponder the issue either... they're clearly pictures of people of questionable age, or made to look as such. Who needs these landmines lying about? Good riddance.

    As far as reporting them goes, that would require me to look at them for longer than it takes to find my browser's back button. The general fear, I think, is that a crusading AG (I am in the US) would see you as an easy target, since in theory at least that image could be cached somewhere on your computer; thus you *have* it, and thanks for reporting yourself, sucker. Who wants to invite that wolf to your door? Who needs the feds kicking down your door at 3 a.m., ransacking your house, filing charges, even if it all gets sorted out later? Try *rehabilitating* yourself after something like that.

    The lack of reporting represents one very dysfunctional thing, I am quite sure: it's silent testimony to people's lack of faith in their attorneys general's sincerity, trustworthiness, honest intentions, and good judgment. Maybe some of them have common sense, but what if you have one who doesn't? How do you know? In the US, more than a few appear to be careerist opportunists, and even likely sociopaths, who have wormed their way into positions of power and will take any innocent, low-hanging fruit they can get, charge the shit out of it, force a plea bargain on the properly terrorized citizen, then use the whole affair in their next election commercial to show that they're "tough on crime".

    I say: go, Google, go. I wish Tumblr and some of the other mainstream image-sharing sites would police themselves a lot better. Are you really saying it costs too much, or that you're worried about *free speech*? Give me a break. These sites, which are raking in cash from their free user-generated content, would, if they had a freaking conscience, jump over each other to get the chance to target this crap for the bit bin and just eat the cost, figuring that along with making money, you're on earth to do some good where you can.

    Guess they don't see things that way.

    • icon
      nasch (profile), 24 Jun 2013 @ 8:40am

      Re: I would guess that stat is about right

      I am not sure if they're actually classifiable as kiddie porn... I wish Tumblr and some of the other mainstream image sharing sites would police themselves a lot better.

      You want them to police themselves against images that may be perfectly legal? Doesn't that sound problematic?

  • identicon
    Beauty in the breakdown, 30 Jul 2013 @ 4:42pm

    I won't lie: I'm into kink porn. It's kinda my thing; been watching it for years. Today I stumbled across a new site, and when I clicked on it, up popped CP. I was so disgusted I closed it right away. In hindsight I wish I hadn't, because then I could have reported it. Anyway, point being, it's out there! It's never been a problem or happened before; now I don't even want to turn my computer back on.

  • identicon
    pedolove, 7 Aug 2014 @ 12:00am

    Child porn should obviously be completely decriminalized, and if you disagree then you are an intolerant tyrannical fuck who deserves to be thrown in a woodchipper.

