Contrary To The Claims Of Grandstanding Politicians, Child Porn Is Very Difficult To Stumble Onto Accidentally
from the so-omnipresent-hardly-anyone-ever-sees-it dept
Google has decided to be even more "proactive" in fighting child pornography, crafting a database of flagged images that will be made available to law enforcement, investigators and even its own competitors. Somehow the company plans to make it searchable while simultaneously deleting the offending items from the web.

There's no reason for Google to be doing this other than as a response to the UK government's consensus that Google = Internet, and is therefore responsible for policing everything it crawls. Unfortunately, many offending images will remain beyond the reach of Google. Additionally, turning a hunt for child porn into an algorithmic search will lead to false positives and deletions, as anyone familiar with ContentID and YouTube can readily attest.
The politicians crusading for a child porn-free internet will be satiated. Google's new offensive plays to their strengths, namely:
1. Proclaiming something must be done.
2. Allowing someone else to do that "something."
The UK's current porn-blocking efforts (of regular, legal porn) are a comedy of errors. "Child safety" filtering on mobile networks has already resulted in the mistaken blocking of YouTube, Orange, and The Jargon File. With these filters becoming mandatory next year, more and more sites will find themselves cut off from their users due to the general ineptness of blocking software crafted at the behest of hand-wringing bureaucrats.
Child porn, however, remains the true enemy, especially in Britain, where its profile is heightened due to recent events. In the oft-echoed call for someone (namely, Google) to do something about child porn, a rather startling statistic was quoted. According to the Internet Watch Foundation (IWF -- an industry-funded group that compiles lists of keywords and illegal abuse sites for subsequent banning by Google, et al), "more than 1.5 million internet users in the UK mistakenly viewed child abuse images last year." (Only 40,000 were reported to the IWF, a point which is left open to speculation.)
It's a rather alarming number. But is it accurate? UK website Ministry of Truth went digging into the math behind this "statistic." The "1.5 million" quote above was pulled from an IWF press release that offered no citations. Perusing the IWF's site itself, MoT found another press release that applied a bit of hedging to the claim.
New study reveals child sexual abuse content as top online concern and potentially 1.5m adults have stumbled upon it.

Note that one word that changes everything.
Hang on a second, we’ve just gone from “1.5 million adults have stumbled across” child porn to “potentially 1.5 million adults have stumbled upon it”, which rather starts to suggest that the IWF’s “study” might not be quite what they’re making it out to be and, sure enough, a little further down the page we hit paydirt:

Long story short (although the long story is a very interesting read), the poll used skewed demographics (weighted heavily towards the 55-and-older set) to produce this meaningless percentage:

The ComRes poll conducted among a representative sample of 2058 British adults for the Internet Watch Foundation (IWF) shows the vast majority of people in Britain think that child sexual abuse content (“child pornography”) (91%) and computer generated images or cartoons of child sexual abuse (85%) should be removed from the internet.
Riiiiight… so it’s not actually a study, it’s an opinion poll; a grade of evidence that generally sits just above the story you heard from a bloke down the pub who swears blind that his cousin’s boyfriend knows a bloke who knows the bloke that it actually happened to.
"- 3% have seen/encountered 'Child pornography'"

The problems with accepting this at face value (and then attaching it to multiple press releases) are numerous. For starters, as many as 1 in 7 UK citizens have never used a computer, much less had internet access. For another, one person's "child porn" is another person's "adult film starring consenting, paid adults." One need look no further than the Daily Mail's disastrous attempt to show how easy it was to find child porn simply by using the same search terms as those found in a convicted child killer's internet history.
According to the 2011 Census the adult population of Great Britain is just over 48.1 million and 3% of that is a little under 1.43 million people, which the IWF has rounded up to 1.5 million (ignoring the usual rules on rounding) for its press releases.
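A quick check of that arithmetic, using the figures quoted above (the population number is approximate), shows the rounding complaint holds up:

```python
# Check the IWF's arithmetic: 3% of the adult population of Great Britain,
# then see what standard rounding gives. Figures are the ones quoted above.
adults = 48_100_000   # 2011 Census, adult population of Great Britain (approx.)
share = 0.03          # 3% of poll respondents

estimate = adults * share
print(f"{estimate:,.0f}")        # 1,443,000
print(round(estimate / 1e6, 1))  # 1.4 -- standard rounding gives 1.4m, not 1.5m
```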
The Mail's Amanda Platell claimed to have taken a journey to the "hell known as internet child porn." Unfortunately, her only souvenir from the trip was a misidentified clip from a 13-year-old (adult) porn film. True, the content of the film would be repulsive to many (simulated sexual assault), but the film was made and distributed legally.
Not only is the number of "potential" child porn viewers lower than the IWF claims, but the number of readily accessible pages containing child porn images on the internet is more "rounding error" than panic-worthy.
Here are the numbers the IWF came up with in its 2012 report.
In total, the IWF found 9,550 web pages that hosted child sexual abuse content spread across 1,561 internet domains in 38 different countries. 60% of the child sexual abuse content identified by the IWF was found on ‘one click hosting website’, i.e. a file hosting service/cyberlocker which, for reasons known only to itself, the IWF insists on referring to as a ‘web locker’ despite the fact that no one else seems to use that particular phrase.

A brief glance at that total should readily tell you the percentage is insignificant. And this is a number compiled by a group tasked with hunting down child pornography, an entity that would have a much higher hit rate than the average person browsing the web. Here's how it stacks up to the whole of the internet.
Out of an estimated 14.8 billion indexed web pages, the British public reported just 9,696 web pages (0.000065%) containing child pornography to the IWF in the whole of 2012.

How hard would it be to access child porn if you weren't looking for it specifically? The Ministry of Truth puts your odds at 1 in 2.6 million searches. (MoT points out the odds will fluctuate depending on search terms used, but for the most part, it's not the sort of thing someone unwittingly stumbles upon.)
In that same year, just 1,561 internet domains (0.001%) were reported to the IWF that were found to contain child pornography, out of a minimum of 145.5 million registered domains (and that’s just for five gTLDs and one country-specific domain).
In fact, on a single ordinary day in May 2013, 92 times as many new domains were registered across just the six TLDs we have figures for, than were reported and found to be hosting child porn by members of the UK general public in the whole of 2012.
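Reproducing those proportions from the figures quoted above confirms the percentages (the page count differs slightly between the 9,550 and 9,696 figures quoted; the second is used here):

```python
# Reproduce the Ministry of Truth's proportions from the figures quoted above.
indexed_pages = 14_800_000_000    # estimated indexed web pages
reported_pages = 9_696            # pages reported to the IWF in 2012
registered_domains = 145_500_000  # minimum registered domains (six TLDs)
reported_domains = 1_561          # domains found hosting such content

page_pct = reported_pages / indexed_pages * 100
domain_pct = reported_domains / registered_domains * 100
print(f"{page_pct:.6f}% of indexed pages")  # 0.000066% (truncated to 0.000065% above)
print(f"{domain_pct:.5f}% of domains")      # 0.00107%, i.e. roughly 0.001%
```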
All those demanding Google do more to block child porn fail to realize there's not much more it can do. The UK already has an underlying blocking system filtering out illegal images at the ISP level, and Google itself runs its own blocker as well.
The above calculations should put the child porn "epidemic" in perspective. As far as the web that Google actively "controls," it's doing about as much as it can to keep child porn and internet users separated. There are millions of pages Google can't or doesn't index, and those actively looking for this material will still be able to find it. Google (and most other "internet companies") can't really do more than they're already doing. But every time a high-profile, child pornography-related crime hits the courtroom (either in the UK or the US), the politicians instantly begin pointing fingers at ISPs and search engines, claiming they're not doing "enough" to clean up the internet, something that explicitly isn't in their job description. And yet, they do more in an attempt to satiate the ignorant hunger of opportunistic legislators.
If Google is "the face of the internet," as so many finger-pointers claim, then the "internet" it "patrols" is well over 99% free of illegal images, according to a respected watchdog group. But accepting that fact means appearing unwilling to "do something," an unacceptable option for most politicians.
Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.
Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.
While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.
–The Techdirt Team
Filed Under: child porn, grandstanding, politicians
Reader Comments
That's not how you deal with child porn. You get the police to do their job: infiltrate the networks where it's distributed and make hell break loose.

But something must be done. Except that we want to make the least effort possible, and we are not really worried about whether it will be effective, eh?
Re: You seem disturbingly expert on the subject
Re: Re:
I think we can guess why so many opinions stated here by ACs are so obviously wrong, if passing familiarity with a subject is suspicious...
Re:
Don't forget that many countries have criminalized drawings of underage sex as well as photos. So even manga is no longer safe.
Re: Re: Re: Re:
Pretty sure it used to be 16 but went up, so stuff that used to be legal now isn't.
Re: Re: Re: Re: Re:
The argument is I believe that "looking at drawings might lead to a thirst for more 'real' things". No-one ever seems to consider or even acknowledge the flip-side to that coin where drawings might assuage the urge and prevent something actually bad...
Re: Re: Re: Re: Re: Re:
Which are actually legal to do. Makes perfect sense. *facepalm* I think a more compelling (less nonsensical) argument would be that it's OK for a 16 year old to have sex with an adult, but that society (and the individuals) is better off if 16 year olds aren't working in porn or prostitution. That is, it's not worse to look at a picture of a naked 16 year old than it is to have sex with one, but it is worse to pay a 16 year old to take his or her clothes off than to do it for non-monetary reasons. That sort of sidesteps the issue of naked photos for fun, though, whether selfies or taken by a boyfriend or what have you.
Contrary to popular belief, paedophiles are not getting this material from Google. They get it from the darknet via peer-to-peer sharing networks, and they use techniques to hide what they are doing. Anything Google (or even the ISPs, for that matter) does will be totally ineffectual.
The problem of child porn will not be solved with a technological solution because it is a social problem and, as such, requires a social solution.
Re:
Now they are the willing masochist that everyone beats on.
I'm not so sure...
Yeah. A bit like your wife finding a pile of singles and an ATM receipt from a gentleman's club balled up in your pocket, and suddenly that kind of thing apparently appears in pockets all the time.
Re: I'm not so sure...
* I think she pretended to be fooled and I pretended it was a good excuse but that's another story.
EVIL CACHES, SUE MOZILLA FOR ALL THE CHILD PORN.
Re: Re: I'm not so sure...
That was Pete Townshend's excuse, and it didn't work out so well for him, either.
Really, is it so hard?

Regarding computer generated images or cartoons of child sexual abuse:

2. It doesn't take a second to find stuff; it's even easier if you use Google: type it in and let Google's search correction do the work on image search.
1.3 Use a TGP: click repeatedly on pics that open other legal TGPs and you may soon be in illegal waters, no looking needed.
1.4 Find a Japanese ranking list and you have access to all kinds of stuff.
1.5 Look for tinami.com and surf over the links.
+ getchu
http://www.google.com/search?client=opera&q=getchu&sourceid=opera&ie=utf-8&oe=utf-8&channel=suggest
+ dlsite, with categories such as: Cosplay (105), Fantasy (799), Heartwarming (1004), Hilarious (1018), Magical Girl (203), Moe (332), Nekomimi (Catgirl) (176), Puni (195), Robot (124), School (461), SF (256), Uniform (571), Yaoi (189), Yuri/Girls Love (196)

...and so on and on.

If you're stupid, you simply look for stuff; if you have brains, you have a VPN.
Not to overlook that just about every internet-using UK citizen totally hates them for doing this.

Had they really wanted to attack child porn, they are over a decade too late to join that party. The internet population chose to self-censor, and with the aid of law enforcement the web was cleaned. So let us be clear about what they find now: namely, one questionable image on an otherwise adult site, where even that image is simply a lawful-age model who looks younger than her true age.

So the home of CP now is the dark nets, like Tor
Re:
So the home of CP now is the dark nets, like Tor, but even there you won't find CP unless you go through many sites and links in an active hunt for it.

I have seen a UK adult site filter in action before, on a mobile SIM, but I gave up all hope on it when many adult sites were not blocked because they were unknown to the filter. Meanwhile, many innocent sites did get blocked, like the ASCII archive. Blocking ASCII art, really?

So this mandatory policy only makes the situation worse by fooling parents into a false sense of security.

To top this off, it is well known that easy porn access has led to a large drop in sexual crimes. So to "save" us from standard porn would mean more people getting molested and raped.

And all this in a country where national newspapers, read by all ages, print women naked, proving to all what topless women look like.
Re: Re:
This is an interesting side note. I'd love to see any evidence to support it.
What is child porn?
I will say that, if you search for porn a lot, it is not hard to stumble across an image of child porn, especially if we are talking softcore "porn". Although I have not tried (for obvious reasons: how many people are really willing to test the truth of the alarmists' assertions?), my sense is that it is rather difficult to find large quantities of "quality" CP.

A very great danger is the use of CP images to blackmail/shake down people, including people in positions of power. It is all too easy for a few images of softcore, borderline porn to get mixed in with legal material. It's far too easy to plant this kind of stuff on someone's computer. That is why possession of CP should be decriminalized. Not legalized: it would still be contraband, and possession would be no more than a violation. The distinction between hardcore and softcore should be taken into account as well. Distribution, and obviously production, would still be crimes of graduated severity.
false positives not a problem
http://googleblog.blogspot.com/2013/06/our-continued-commitment-to-combating.html
It is hard to tell from this what exactly Google has done on its own in this effort apart from providing money, software and hardware to other groups trying to identify and maybe filter out child porn from the internet.
A few basics. Google uses a database of hashes to identify copies of known child porn images. It is not clear that Google itself has added to the databases that have been created through law enforcement efforts. The hashes have traditionally used MD5, which has known cryptographic weaknesses, but those relate to collisions, not to designing a file that, when hashed, will match a given target hash value; that remains very hard to do, so the problem of false positives is pretty much nonexistent. The issue may become important if there is an attempt to use hash values of encrypted files in court to prove possession of child porn. I must emphasize that with a 16-byte (128-bit) hash, accidental collisions are exceedingly unlikely. Unlike the algorithms used for ContentID, there will not be a problem with false positives.
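A minimal sketch of the hash-database matching described above; the "known" digest set here is invented for illustration and stands in for whatever database law enforcement actually maintains:

```python
import hashlib

# Sketch of hash-database matching: a set of MD5 digests of known images,
# checked against incoming files. The digest set here is invented for
# illustration; it is not any real database.
known_hashes = {hashlib.md5(b"known-image-bytes").hexdigest()}

def is_flagged(file_bytes: bytes) -> bool:
    """True if the file's MD5 digest appears in the known-image database."""
    return hashlib.md5(file_bytes).hexdigest() in known_hashes

print(is_flagged(b"known-image-bytes"))  # True: an exact copy matches
print(is_flagged(b"harmless-image"))     # False: an accidental 128-bit collision is vanishingly unlikely
```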
The Google blog says:
"Recently, we’ve started working to incorporate encrypted “fingerprints” of child sexual abuse images into a cross-industry database."
I believe that should be corrected to say that they are incorporating fingerprints of encrypted child sexual abuse images into a cross-industry database. That is, they are trying to identify encrypted child porn by hashing those files as well. This is potentially useful but assumes that pedophiles don't re-encrypt images for storage or further distribution.
It is not clear, from the blog, whether Google actually filters out search results for cp files, or web pages containing such files. Nor is it clear that Google uses any other methods to identify such files (e.g. searching gmail accounts for matches).
It is fairly easy to defeat identification through a hash database by altering the image file in any minimal way. Apparently, law enforcement has had pretty good success in identifying known porn images through this method, so the conventional wisdom that most criminals are stupid seems to be true. My impression of Google's blog is that it is a feel-good PR piece that makes it seem like Google is doing a lot when it really isn't. Not that it should have to. Law enforcement should welcome the status quo, as it can identify and track pedophiles because they are using the internet to exchange files that are not uniquely encrypted. If such exchanges become impossible due to filtering from companies such as Google, then exchanges will be pushed further underground with the use of cryptography, so that even law enforcement will have a hard time identifying and tracking pedophiles.
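The evasion described above is easy to demonstrate: flipping a single bit of a file yields a completely different MD5 digest, so a naive hash-database lookup misses the altered copy (toy bytes stand in for an image file here):

```python
import hashlib

# Flip one bit of a file: its MD5 digest changes completely, so a naive
# hash-database lookup no longer matches. (Toy bytes, not a real image.)
original = b"example image bytes" * 100
altered = bytearray(original)
altered[0] ^= 0x01  # change a single bit

h1 = hashlib.md5(original).hexdigest()
h2 = hashlib.md5(bytes(altered)).hexdigest()
print(h1 == h2)  # False: any minimal alteration breaks the match
```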
So, how high is the crime rate of this "Britain" the UK political corpus patrols?
Because blaming Google is easy
But, nope: as usual, they think it's better to put a band-aid on the issue to cover it up rather than actually dealing with it, because dealing with it is hard. Plus, tracking down the creators and owners of the websites takes time, and the arrests could occur when the next DA takes office, so it'll count on his record.
And how many of these websites are located in Russia, China, or some other country where the UK politicians can't easily arrest someone? It's much easier to go after a large US corporation which can be "persuaded" to cooperate under threat of not being able to do business in the country.
Re: Because blaming Google is easy
When you are in a position to order someone to do something, without having to tell them how, all problems are easy to solve. When asked how, you can simply say that it is up to the questioner to find the solution.

Failure to solve the problem is not your problem, but rather a failure of people to do what you told them to do.

Further, people will report that they have done as requested when they have merely passed the problem down the food chain.

This is how large bureaucracies end up lying to themselves.
I would guess that stat is about right
As far as reporting them goes, that would require me to look at them for longer than it takes to find my browser's back button. The general fear, I think, is that a crusading AG (I am in the US) would see you as an easy target, since in theory at least that image could be cached somewhere on your computer; thus you *have* it, and thanks for reporting yourself, sucker. Who wants to invite that wolf to your door? Who needs the feds kicking down your door at 3 a.m., ransacking your house, filing charges, even if it all gets sorted out later? Try *rehabilitating* yourself after something like that.

The lack of reporting represents one very dysfunctional thing, I am quite sure. It's silent testimony to people's lack of faith in their attorneys general's sincerity, trustworthiness, honest intentions and good judgement. Maybe some of them have common sense, but what if you have one who doesn't? How do you know? In the US, more than a few appear to be careerist opportunists, and even likely sociopaths, who have wormed their way into positions of power and will take any innocent, low-hanging fruit they can get, charge the shit out of it, force a plea bargain on the properly terrorized citizen, then use the whole affair in their next election commercial to show that they're "tough on crime".

I say go, Google, go. I wish Tumblr and some of the other mainstream image-sharing sites would police themselves a lot better. Are you really saying it costs too much, or that you're worried about *free speech*? Give me a break. These sites, which are raking in cash from their free user-generated content, would, if they had a freaking conscience, jump over each other to get the chance to target this crap for the bit bin and just eat the cost, figuring that along with making money, you're on earth to do some good where you can.
Guess they don't see things that way.
Re: I would guess that stat is about right
You want them to police themselves against images that may be perfectly legal? Doesn't that sound problematic?