Cloudflare Makes It Easier For All Its Users To Help Stop Child Porn Distribution

from the this-is-good dept

We recently wrote about how Senators Lindsey Graham and Richard Blumenthal are preparing FOSTA 2.0, this time focused on child porn -- which is now being renamed "Child Sexual Abuse Material," or CSAM. As part of that story, we highlighted that these two Senators and some of their colleagues had begun grandstanding against tech companies in response to a misleading NY Times article that seemed to blame internet companies for the rising number of CSAM reports to NCMEC -- when that rise should instead be seen as evidence of how much the companies are doing to try to stop CSAM.

Of course, working with NCMEC and other such organizations takes a lot of effort. Scanning for shared hashes of CSAM isn't something every internet site can do; it's mostly done by the larger companies. But last week Cloudflare (one of the companies from which Senators are demanding "answers") did something quite fascinating: it enabled all Cloudflare users, no matter their level of service, to start using Cloudflare's CSAM scanning tools for free, even allowing them to set their own rules and preferences (something that might become very, very important if the Graham/Blumenthal bill becomes law).

I highly recommend reading the entire Cloudflare post, because it's a clear, interesting, and easy-to-read explanation of how fuzzy hashing works (including pictures of dogs and bicycles). As the post notes, those who use such fuzzy hashing tools have intentionally kept at least some of the details secret -- because being too public about them would allow those producing and distributing CSAM to make changes that dodge the various tools and filters, which would obviously be a problem. However, that secrecy also creates two potential issues: (1) a lack of transparency in how these filtering systems really operate, and (2) an inability for all but the largest players to make use of these tools -- which would be disastrous for smaller companies if they were required to use them.
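As a rough illustration of the fuzzy-hashing concept the post describes, here is a deliberately simplified Python sketch: an "average hash" compared by Hamming distance. The real tools are proprietary and far more robust; everything below (the hash scheme, the sample pixel values) is invented for illustration only.

```python
# Illustrative sketch of "fuzzy" perceptual hashing (simplified, hypothetical).
# The idea: similar images should produce hashes that differ in only a few
# bits, so near-duplicates still match even after small edits.

def average_hash(pixels):
    """Hash a grayscale image (rows of 0-255 values) into a list of bits.

    Each bit records whether a pixel is brighter than the image's mean,
    so recompression or slight brightness changes flip few or no bits.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    """Count differing bits: 0 means identical, higher means less similar."""
    return sum(a != b for a, b in zip(h1, h2))

# A tiny 4x4 "image" and a slightly brightened copy of it.
original = [[10, 200, 30, 220],
            [15, 210, 25, 230],
            [200, 20, 210, 10],
            [220, 30, 200, 25]]
tweaked = [[value + 5 for value in row] for row in original]

d = hamming_distance(average_hash(original), average_hash(tweaked))
print(d)  # -> 0: the edited image still matches the original's hash
```

The point is that a near-duplicate image lands within a few bits of the original's hash, so matching becomes a question of how many differing bits you tolerate -- which is exactly the "threshold" the Cloudflare post goes on to discuss.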

And that's where Cloudflare's move is quite interesting. By providing the tool for free to all of its users, it keeps the proprietary nature of the tool secret while still letting each site set its own matching thresholds.

If the threshold is too strict — meaning that it's closer to a traditional hash and two images need to be virtually identical to trigger a match — then you're more likely to have many false negatives (i.e., CSAM that isn't flagged). If the threshold is too loose, then it's possible to have many false positives. False positives may seem like the lesser evil, but there are legitimate concerns that increasing the possibility of false positives at scale could waste limited resources and further overwhelm the existing ecosystem. We will work to iterate the CSAM Scanning Tool to provide more granular control to the website owner while supporting the ongoing effectiveness of the ecosystem. Today, we believe we can offer a good first set of options for our customers that will allow us to more quickly flag CSAM without overwhelming the resources of the ecosystem.

Different Thresholds for Different Customers

The same desire for a granular approach was reflected in our conversations with our customers. When we asked what was appropriate for them, the answer varied radically based on the type of business, how sophisticated its existing abuse process was, and its likely exposure level and tolerance for the risk of CSAM being posted on their site.

For instance, a mature social network using Cloudflare with a sophisticated abuse team may want the threshold set quite loose, but not want the material to be automatically blocked because they have the resources to manually review whatever is flagged.

A new startup dedicated to providing a forum to new parents may want the threshold set quite loose and want any hits automatically blocked because they haven't yet built a sophisticated abuse team and the risk to their brand is so high if CSAM material is posted -- even if that will result in some false positives.

A commercial financial institution may want to set the threshold quite strict because they're less likely to have user generated content and would have a low tolerance for false positives, but then automatically block anything that's detected because if somehow their systems are compromised to host known CSAM they want to stop it immediately.
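The three customer examples above reduce to two per-site settings: how close a fuzzy-hash match must be, and whether a match is blocked automatically or queued for human review. A hypothetical sketch of that policy logic (the threshold numbers and policy names here are invented for illustration, not Cloudflare's actual options):

```python
# Hypothetical sketch of per-site scanning policies like the ones Cloudflare
# describes: each site picks a matching threshold (how many bits of fuzzy-hash
# distance still count as a match) and whether matches are auto-blocked.

from dataclasses import dataclass

@dataclass
class SitePolicy:
    max_distance: int   # loose = high value (more matches, more false positives)
    auto_block: bool    # block immediately vs. queue for human review

POLICIES = {
    # Mature social network: loose threshold, manual review by its abuse team.
    "social_network": SitePolicy(max_distance=16, auto_block=False),
    # New parenting forum: loose threshold, auto-block (no abuse team yet).
    "parenting_forum": SitePolicy(max_distance=16, auto_block=True),
    # Bank: strict threshold (near-exact matches only), but auto-block.
    "bank": SitePolicy(max_distance=2, auto_block=True),
}

def decide(site, distance):
    """Return the action for an upload whose fuzzy-hash distance to a known
    bad image is `distance`, under the site's chosen policy."""
    policy = POLICIES[site]
    if distance > policy.max_distance:
        return "allow"
    return "block" if policy.auto_block else "flag_for_review"

print(decide("bank", 10))            # -> allow: only near-exact matches count
print(decide("social_network", 10))  # -> flag_for_review: loose, human review
print(decide("parenting_forum", 10)) # -> block: loose and auto-blocking
```

The looser the threshold, the more altered copies get caught -- and the more false positives a site's abuse process must absorb, which is exactly the trade-off Cloudflare describes.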

This is an incredibly thoughtful and nuanced approach, recognizing that when it comes to any sort of moderation, one size can never fit all. And, by allowing sites to set their own thresholds, it actually does add in a level of useful transparency, without exposing the inner workings that would allow bad actors to game the system.

That said, I can almost guarantee that someone (or perhaps multiple someones) will come along before too long and incorrectly or misleadingly spin Cloudflare's efforts to help all of its users combat CSAM into a claim that Cloudflare is somehow helping sites hide or enable CSAM. No good deed goes unpunished.

However, if you want to support actual solutions -- not grandstanding nonsense -- to try to deal with CSAM, approaches like Cloudflare's are worth paying attention to. This is especially true if Graham/Blumenthal and others get their way. Under proposals like theirs, it will become virtually impossible for smaller companies to take the actions necessary to meet the standards that avoid legal liability. And that means that (once again) the big internet companies will end up getting bigger. They all have access to NCMEC and the necessary tools to scan for and submit CSAM. Smaller companies don't. Cloudflare offering up its scanning tool to everyone helps level the playing field in a really important way.


Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.

Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.

While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.

–The Techdirt Team

Filed Under: child porn, csam, fuzzy hashing, infrastructure, tools
Companies: cloudflare


Reader Comments



  1. Stephen T. Stone (profile), 24 Dec 2019 @ 11:22am

    The politicians will say “well, this is a start” and keep pushing for those CSAM laws anyway. Why? Because they believe the nerds can nerd harder and stop all CSAM from being posted at all, even on the Dark Web.

  2. Anonymous Coward, 24 Dec 2019 @ 11:32am

    With the way the wind is blowing, Cloudflare are setting themselves up, deliberately or accidentally, to be the Internet filter, especially once the Copyright cartel send in their enforcers.

  3. ysth (profile), 24 Dec 2019 @ 12:43pm

    "Cloudflare offering up its scan tool for everyone"

    There is still a subtle distinction between "everyone" and "Cloudflare customers" :)

  4. SA C.P. Distributor, 24 Dec 2019 @ 7:24pm

    if you CSAMthing...

    Well, if you CSAMthing, say nothing, because all those its for the children® people will always find a new way to exploit your good will to pass more speech crushing laws; or target you with their online hoodlums.

    The irony of course is that it's those exact people, and especially their toxic NGOs, who are actually distributing CSAM all over the globe -- they are what CSAM distribution networks are.

    NCMEC is not just some innocent bystander, because they (like every other NGO involved in this issue) have an internet warehouse FULL of CSAM that they lend to local police forces around the world to use in black operations of all kinds.

    CSAM was virtually non-existent until the CIA imported Nazi svengalis during the Op Paperclip/MKultra era, and then there was an explosion of it, which continues today, because the governments and NGOs actually use CSAM as a weaponized form of propaganda/compromise/blackmail tool.

    This was most clearly seen in the use by British Mi5-6/JTRIG, where they used it to target ME terrorists, but we also see it coming from Israeli and other private contractors too.

    And of course, the MSM, ranging from the toxic Nicholas Kristoff, to the other CIA/FBI mockingbirds are highly invested in not reporting about these topics in a factual manner.

    Ultra conservatives all donate their god-dollars to these NGOs, and even the UN is involved with their empowerment programs, and in turn these NGOs are interlinked around the world, and quite notoriously, the AU, Swedish, British, and other intel agencies distribute CSAM for year long stretches in a catch-me-if-you-can pattern of malicious government conduct.

    And these are just the open secrets about CSAM and the actual distribution networks around the world that can be found by reading the news.

    So, NCMEC, etc. -- none of them have clean hands in this issue, and in fact and in practice they are regularly found to be actual distributors.

  5. R/O/G/S, 24 Dec 2019 @ 10:51pm [flagged by the community]

    The NYT piece has some great infographics explaining the fuzzy hash-but nothing about explaining the fuzzy logic of why it demonizes fathers repeatedly by holding up the extreme and horrific monsters that are actually radical outliers as sexual abusers while completely exonerating women and mothers and other female pedophiles by default, despite this quote:

    Another image, found in September 2018, depicts a woman [blank blanking the blank] of a naked 2-year-old girl. Google declined to take down the photo, stating in an email to the Canadian analysts that while it amounted to pedophilia, “it’s not illegal in the United States.”

    That one quote highlights the gender agenda of both the writer, and the massive accountability gap between how we perceive child abuse when its a female perpetrator, because our entire cultural construct is flawed.

    This complicity with females who sexually abuse kids, or who minimize insight into this forbidden topic has been going on since the 1980s porn wars, and continues today, as radical elements of the left continues to condone attrocious mothering, and even child abuse BY women, as long as it furthers the DVIC agenda of “owning the sex supply”.

    Then, the article goes on about foster parents, blahblah blah, while making the not so subtle case that massive spying is ok, as long as the Good Corporations® full of Good Men, and Better Women® do it, because, of course, we can trust Cloudflare, and Peter Thiel, and Mark Zuckerberg, and a few lesbian gender warriors from a Toronto CSAM clearinghouse with troves of child pornography.

    Sure...seems legit....

    Pure, distorted, American/ western false moral imperatives, and none of it has or ever will help those children heal.

    From the family court to foster care to prison pipeline, some 70% of kids might be sexually exploited by predators (and statistics bear that out), but 100% of them will definitely be exploited by people who work in DVIC industries, using those poor babies in moral panics, and getting a paycheck for doing it, while never “solving” the problem of child abuse, or CSAM, precisely because it is a self-perpetuating business model.

    https://www.hg.org/legal-articles/sexual-abuse-an-epidemic-in-foster-care-settings-6703

    And the presence of religion prone, middle aged, withered white women at the center of the dialogue is kind of hard to miss, considering that they are also prime suspects since forever in waging religious tolerance movements, and moral panics that have preceded many, many genocides too.

  6. Rekrul, 25 Dec 2019 @ 1:38am

    If the threshold is too strict — meaning that it's closer to a traditional hash and two images need to be virtually identical to trigger a match — then you're more likely to have many false negatives (i.e., CSAM that isn't flagged). If the threshold is too loose, then it's possible to have many false positives.

    One only has to search for an image on Google and look at the "visually similar" pictures it suggests to see how good computers are at choosing similar photos.

  7. Stephen T. Stone (profile), 25 Dec 2019 @ 6:02am

    Cool story, bro.

    Go submit it to an MRA forum.

  8. R/O/G/S, 25 Dec 2019 @ 11:21pm [flagged by the community]

    Re: rights are rights, for everyone

    Of course Stephen T. Stone is a child rapist apologist, but only when its female child rapists like his mommy.

    Coincidentally, hes also a fan of enemas and anal suppositories too.

    Any thoughts on the actual article above, or are you just here as a DVIC douchebag ?

    [writer wonders aloud “Is this Stone guy part of the medical community that exploits, and profiteers off of the suffering of these children?]

    Mother who put boy through 13 medically unnecessary surgeries sentenced to prison

    https://nypost.com/2019/08/18/texas-mom-put-healthy-son-through-13-unnecessary-surgeries/

  9. Anonymous Coward, 26 Dec 2019 @ 3:52am

    Re:

    Not just a Nazi bitch, but a kink shamer as well. I didn’t think you could get much lower then sniffing your dads shit stained underwear bro. But here you are proving me wrong.

  10. Anonymous Coward, 26 Dec 2019 @ 9:16am

    Cloudflare is to the internet as a dam is to a flowing river.

    Ever since Cloudflare hit the internet, it's been slowly eroding it by pretending it's not responsible for the actions of its users.

    Before anyone chimes in and defends this bullshit, just remember one thing: Snowden released documents upon documents regarding to how the NSA spied on Americans and the world.

    Is there a particular reason you never once asked the question of where the software came from?

  11. Mel Feasans, 26 Dec 2019 @ 1:54pm

    Hope springs eternal.

    "It will become virtually impossible for smaller companies to take the actions necessary to meet the standards to avoid legal liability. And that means that (once again) the big internet companies will end up getting bigger."

    I keep wondering, whenever I see articles that discuss legislative efforts concerning the internet, if anyone will ever come to the simple realization that making the internet less safe for citizens and making internet mega-corps bigger is actually the true purpose behind 99% of such legislative proposals.
    Oh well. Perhaps some day.

  12. R/O/G/S, 28 Dec 2019 @ 2:32am

    Re: Re:We are all kinked, but ur a kook

    You are one bizarre, flaming zionazi hemorhoid, and a child abuse apologist to boot.

    There is nothing wrong with consensual kink, but I draw the line where you stand: equating sexual abuse of boys and other children by females as a “kink.”

    Unfortunately, many abused children later find voice in fetish communities and kink, too, while never addressing child sexual abuse perpetrated by females or others.

    Go kill yourself already.

    Just do it,

    bro

  13. AC Liberation Front, 28 Dec 2019 @ 3:21am

    the most censored thought on the internet

    LIBERATE FEMALE PEDOPHILES AND CHILD ABUSERS NOW!©

    LIBERATE STEPHEN STONE NOW!©

    [comment 5 reposted verbatim]

  14. R/O/G/S, 28 Dec 2019 @ 3:58am

    Re: Re:

    Listen, DonkeyHotep, I dont know what youre rambling about, but for a Hotep with such big ears, you are one tone deaf jackass.

  15. Anonymous Coward, 28 Dec 2019 @ 8:07pm

    Cloudflare will automatically send a notice to you when it flags CSAM material, block that content from being accessed (with a 451 “blocked for legal reasons” status code), and take steps to support proper reporting of that content in compliance with legal obligations.

    Sadly, that means that the webmaster who enables this is only inviting unwanted attention from the feds the moment some random user uploads something inappropriate. Not only does every trigger of the CSAM flag (false-positive or otherwise) bring the site one step closer to being shut down by Cloudflare, it also brings the site one step closer to becoming a target of a federal fishing expedition... like the one that brought down Backpage.

    Backpage used to like to rat users out to NCMEC. Eventually, they realised the price of filing too many of those reports was that investigations were directed not against the offending users but against Backpage itself. At some point, "shoot, shovel and shut up" looks really tempting when dealing with net.abuse as not every site has the resources to defend itself from abusive governments.


