Researchers Who Built Similar System Explain Why Apple's CSAM Scanning System Is Dangerous

from the it's-not-good dept

Jonathan Mayer, a Princeton University professor and former chief technologist at the FTC, is one of the smartest people I know. Every time I've spoken with him, I feel like I learn something. He has now written a quite interesting article for the Washington Post explaining how he and a Princeton graduate researcher, Anunay Kulshrestha, actually built a CSAM scanning system similar to the one Apple recently announced, the very system that has security experts up in arms over the risks inherent in the approach.

Mayer and Kulshrestha note that while Apple is saying that people worried about its system are misunderstanding it, they are not. They know exactly what they're talking about -- and they still say the system is dangerous.

We wrote the only peer-reviewed publication on how to build a system like Apple’s — and we concluded the technology was dangerous. We’re not concerned because we misunderstand how Apple’s system works. The problem is, we understand exactly how it works.

Mayer and Kulshrestha are certainly not making light of the challenges and problems associated with stopping CSAM. That's why they started their project: to see if there was an effective way to identify CSAM even in end-to-end encrypted systems. And they even built a system to do just that. And what they found is that the risks are simply too great.

We were so disturbed that we took a step we hadn’t seen before in computer science literature: We warned against our own system design, urging further research on how to mitigate the serious downsides. We’d planned to discuss paths forward at an academic conference this month.

That dialogue never happened. The week before our presentation, Apple announced it would deploy its nearly identical system on iCloud Photos, which exists on more than 1.5 billion devices. Apple’s motivation, like ours, was to protect children. And its system was technically more efficient and capable than ours. But we were baffled to see that Apple had few answers for the hard questions we’d surfaced.

The potential dangers Mayer and Kulshrestha describe are exactly what many warned about when Apple announced its plans:

After many false starts, we built a working prototype. But we encountered a glaring problem.

Our system could be easily repurposed for surveillance and censorship. The design wasn’t restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.

A foreign government could, for example, compel a service to out people sharing disfavored political speech. That’s no hypothetical: WeChat, the popular Chinese messaging app, already uses content matching to identify dissident material. India enacted rules this year that could require pre-screening content critical of government policy. Russia recently fined Google, Facebook and Twitter for not removing pro-democracy protest materials.

We spotted other shortcomings. The content-matching process could have false positives, and malicious users could game the system to subject innocent users to scrutiny.
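
The "swap in any content-matching database" point deserves emphasis, because it is structural rather than a matter of policy. Here is a minimal sketch in Python (our own illustration, not the researchers' or Apple's code) of what the core of any such scanner looks like:

    # Hypothetical illustration of a content-matching scanner. The
    # hashes are opaque: nothing in this code knows, or can know,
    # whether hash_db fingerprints CSAM or disfavored political speech.
    def scan_library(images, hash_db, perceptual_hash):
        """Return the images whose hash appears in the given database."""
        return [img for img in images if perceptual_hash(img) in hash_db]

Repurposing such a system for surveillance or censorship requires no code changes at all, just a different database -- which is exactly the danger the researchers flagged.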

That's already pretty damning. But even more damning information has come out as well. As we noted in our earlier posts (and as Mayer and Kulshrestha noted in their research), the risk of false positives is extremely high. And late last week, the hypothetical became a lot more real: someone reverse engineered the NeuralHash algorithm that Apple is using and put it on GitHub.

It did not take long for people to point out that, first of all, there are collisions between totally unrelated images:

Now, you might say that since that second image is not really an image, maybe it's not that big a deal? Well, about that...

And, of course, one of the issues with any hash-based system is that subtle changes to an image can produce a totally different hash, making it easy for some to simply route around the scanning. Apple had suggested its system could defeat that, but...
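
It helps here to understand that NeuralHash is a perceptual hash, not a cryptographic one: it is deliberately designed so that visually similar images produce the same hash, which means it has to tolerate some variation, and that tolerance is what both collisions and evasion exploit. As a toy illustration of the idea (this is the classic "average hash," a drastic simplification we chose for clarity, not Apple's algorithm), consider this Python sketch:

    from PIL import Image

    def average_hash(path, size=8):
        # Downscale to a tiny grayscale thumbnail, then set one bit per
        # pixel depending on whether it is brighter than the mean. Small
        # edits barely move the bits; bigger ones flip them.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = "".join("1" if p > mean else "0" for p in pixels)
        return int(bits, 2)

    def hamming_distance(h1, h2):
        # Number of bits on which two hashes disagree; 0 means a match.
        return bin(h1 ^ h2).count("1")

A hash loose enough to survive resizing and recompression is, by the same token, loose enough to collide on unrelated images; and once the algorithm is public, an attacker can search for small perturbations that either force a collision or break a legitimate match.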

This is part of the reason, as we highlighted earlier, that security researchers are so up in arms about this. Apple seemingly ignored much of the research and conversation already happening around these approaches, and just barged right in announcing that it had a solution, without exploring the tradeoffs and difficulties involved -- leaving those for security experts to point out afterwards.

Apple is trying to downplay these findings, saying that it expected the collisions, at least, and that its system would also run a separate, server-side hash comparison to weed out the false matches. Though, as Bruce Schneier points out, if this was "expected," then why wasn't it discussed in the initial details that were released? Similarly, I have yet to see a response to the flip-side issue: images altered so that they fool NeuralHash while still looking the same to a human.
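
For what it's worth, here is a rough sketch of what that two-stage design would look like. This is our own illustration with invented names (Apple has not published its server-side algorithm); it shows why a second, secret hash blunts targeted collision attacks, and also why it doesn't fix the underlying problem:

    def flag_on_device(images, device_hash, device_db):
        # Stage 1, on the phone: match against the shipped database.
        # Adversarial collisions target this stage, since NeuralHash
        # has now been reverse engineered and is publicly computable.
        return [img for img in images if device_hash(img) in device_db]

    def verify_on_server(candidates, server_hash, server_db):
        # Stage 2, on Apple's servers: recompute with an independent,
        # unpublished hash. A collision crafted against stage 1 is
        # unlikely to survive this check -- but the innocent image has
        # already been flagged and surfaced for inspection.
        return [img for img in candidates if server_hash(img) in server_db]

The second check may reduce false reports to human reviewers; it does nothing about the fact that an innocent image can be flagged, and uploaded for scrutiny, in the first place.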

I know Apple keeps insisting that it has thought through all of this, but it doesn't seem to have thought through how the security community would see any of it, and its after-the-fact scrambling is not exactly reassuring.



Filed Under: backdoors, client side scanning, csam, encryption, hash clash, jonathan mayer, surveillance
Companies: apple


Reader Comments



  1. MathFox, 23 Aug 2021 @ 9:48am

    They shouldn't avoid our email scanning

    From 9to5 Mac

    Apple has confirmed to me that it already scans iCloud Mail for CSAM, and has been doing so since 2019. It has not, however, been scanning iCloud Photos or iCloud backups.

    There is more in the article...


  2. Anonymous Coward, 23 Aug 2021 @ 9:58am

    The big problem is that they have implemented a system that has root access and can exfiltrate anything they want from the phone. Scanning on the phone saves Apple CPU power, but that is incidental to the fact that they have implemented a system giving them full access to, and export of, anything on the phone. That is not merely a backdoor, but full government access to phone contents, including keys as they are entered, as soon as a government can compel Apple to comply with its demands.


  3. Anonymous Coward, 23 Aug 2021 @ 10:20am

    Re:

    Quite the contrary. It is a backdoor, just one surrounded with neon lights and illuminated arrows pointing right at it.

    Full government access is one thing, but this goes much further than that. Apple will happily give access to anyone who's willing to pay, so don't be surprised if you're called in front of your employer's HR because you have something on your iPhone they didn't like.

    Don't count on "lawful intercept" or wiretapping laws to protect you either. If Apple hasn't already, you can be sure it will update the iPhone terms and conditions so it can claim that you "voluntarily agreed" to this snooping.


  4. Anonymous Coward, 23 Aug 2021 @ 10:31am

    Apple says it won't give in to GOVERNMENT demands.

    Apple has already set up its own shell company. It plans to "sell" the data to the shell company, which will sell the data to various governments around the world.

    Thus keeping "the literal letter of the promise, but actually making a lot of money".

    The internal plans specify child porn as "the ice breaker" to get people used to being spied on; then they will start on emails, documents, passwords, and browsing history, one by one. A slow erosion of privacy, in the hopes the majority of customers won't notice.


  5. ECA (profile), 23 Aug 2021 @ 10:57am

    old idea

    There was an interesting thought long ago, and a debate that didn't last too long.
    If we could all read each other's minds, wouldn't it be a great thing?
    We could understand each other fully and with meaning.
    And know every crook and thief around us.
    Learning to control ourselves and our minds would be the first challenge.
    We would know WHO needs help.
    We could understand and KNOW how others feel about special circumstances, and maybe find out how to adjust them to be a bit different. But we would know what created those things in their minds.

    But those most scared are the ones hiding the truth.


  6. Anonymous Coward, 23 Aug 2021 @ 10:58am

    Hash

    I think the folks at Apple may be enjoying a different kind of Hash if they can claim that they actually knew about the problems being pointed out and still went forward with it.

    It's a lot like replacing a bridge with nothing, then pointing out that the cars were able to drive off the cliff without any problems and that this was fully expected.

    They should share that Hash.


  7. That One Guy (profile), 23 Aug 2021 @ 11:08am

    'You had One Job'

    A system for catching certain content that is trivial to trick or bypass, will produce a significant number of false positives, and can easily be repurposed to go after any number of other types of content without the user's knowledge -- with the only thing stopping that being Apple pinky-promising not to do it, and/or other groups (government or not) doing it themselves.

    Oh yeah, they really thought this one through before rolling it out.


  8. Ninja (profile), 23 Aug 2021 @ 11:49am

    Considering it can be weaponized in many ways beyond governments using it for censorship, and considering the "for the children" mantra is often wielded by people who couldn't care less about children or civil rights (i.e. the Trump horde, dictators around the world, etc.), this will be very ugly to watch. Thanks, Apple. And if I didn't already have enough reasons not to use Apple products, now I have one that's probably worth all the other reasons multiplied by a few orders of magnitude.


  9. Anonymous Coward, 23 Aug 2021 @ 11:51am

    A statement from Tim Cook

    A statement from Tim Cook, found in an interview with the Australian Financial Review about tech companies and privacy:

    "Technology will only work if it has people's trust."

    Guess they just threw that into Apple's dust bin.


  10. Anonymous Coward, 23 Aug 2021 @ 12:32pm

    Just another example of why policies don't mean anything if they have your data, or access to it.


  11. BernardoVerda (profile), 23 Aug 2021 @ 5:51pm

    Re:

    This technology is a policy, and is as subject to change and gaming as any other policy -- it's just that this policy is now implemented on the "user's" (we used to call him/her the "owner") own phone.


  12. Darkness Of Course (profile), 23 Aug 2021 @ 6:59pm

    Data center power usage model

    Google showed up at a conference and presented their whizzy solution: reduce the transformations from DC to AC to DC, and some other bits.

    Some got up in their faces, because there was an industry-wide panel working on just that, and it appeared to be further down the road than Google was.

    If you are absolutely convinced you are the best, then checking your results against the work of others will never occur to you.


  13. That Anonymous Coward (profile), 23 Aug 2021 @ 10:17pm

    From the company that told you that you were holding their phone wrong rather than admit a massive design failure...
    You expect them to admit their idea has any possible flaws?

    Something something how ARE humans still alive.


  14. Anonymous Coward, 24 Aug 2021 @ 1:57am

    Re: Re:

    It's not a backdoor; Apple has wedged the front door open.


  15. BG (profile), 24 Aug 2021 @ 7:48am

    Re:

    ... and if official/government access was all we had to worry about, that would be tolerable ... possibly.

    That's not all we have to worry about, though. A system like this is 100% going to be targeted by adversarial intelligence services, criminals, hackers, etc. One way or another, people who have no legitimate or court-authorised right to the system will make use of it. Whether it's old-fashioned bribery/blackmail, hacking, or exploiting software flaws, they will get access to it.


  16. nasch (profile), 25 Aug 2021 @ 6:58am

    Re:

    Apple has already set up its own shell company. It plans to "sell" the data to the shell company, which will sell the data to various governments around the world.

    Is this speculative fiction, or a thing that's actually happened? If the latter, where can we read about it?


