Apple's New Scanning Tools Raising More Concerns, Even Inside Apple

from the take-a-step-back dept

Last week, we wrote about our concerns with Apple's newly announced scanning efforts, which the company claimed were meant to protect children. Lots of security experts raised concerns about how this was being rolled out -- and none of those complaints were meant to take away from the very real and legitimate problem of child sexual abuse. Security guru Alex Stamos wrote one of the most thoughtful threads about the whole thing, noting (as with so many of these issues) that there are no easy answers here. I highly recommend reading the entire thread.

As with so many debates over nuanced tech policy issues with no easy answers, a big part of the problem here is, as Stamos notes in his thread, that there are tons of conversations happening about the nuances and tradeoffs. Apple's approach is not as disastrous and dangerous as it could have been -- clearly the team at Apple put a lot of thought into minimizing many (though not all) of the risks -- but it was still developed without talking to the many, many people who have been trying to find a reasonable balance here. And that messes a lot of stuff up.

Stamos, along with computer science professor and security researcher Matt Green, has now published a good piece in the NY Times highlighting their concerns. The article notes there is less concern about the iMessage child safety features (Apple's initial description of those seemed much more concerning, but the details show why they're not that bad). But the photo scanning on the phone raises a lot of concerns:

But the other technology, which allows Apple to scan the photos on your phone, is more alarming. While Apple has vowed to use this technology to search only for child sexual abuse material, and only if your photos are uploaded to iCloud Photos, nothing in principle prevents this sort of technology from being used for other purposes and without your consent. It is reasonable to wonder if law enforcement in the United States could compel Apple (or any other company that develops such capacities) to use this technology to detect other kinds of images or documents stored on people’s computers or phones.

While Apple is introducing the child sexual abuse detection feature only in the United States for now, it is not hard to imagine that foreign governments will be eager to use this sort of tool to monitor other aspects of their citizens’ lives — and might pressure Apple to comply. Apple does not have a good record of resisting such pressure in China, for example, having moved Chinese citizens’ data to Chinese government servers. Even some democracies criminalize broad categories of hate speech and blasphemy. Would Apple be able to resist the demands of legitimately elected governments to use this technology to help enforce those laws?

Another worry is that the new technology has not been sufficiently tested. The tool relies on a new algorithm designed to recognize known child sexual abuse images, even if they have been slightly altered. Apple says this algorithm is extremely unlikely to accidentally flag legitimate content, and it has added some safeguards, including having Apple employees review images before forwarding them to the National Center for Missing and Exploited Children. But Apple has allowed few if any independent computer scientists to test its algorithm.

The computer science and policymaking communities have spent years considering the kinds of problems raised by this sort of technology, trying to find a proper balance between public safety and individual privacy. The Apple plan upends all of that deliberation. Apple has more than one billion devices in the world, so its decisions affect the security plans of every government and every other technology company. Apple has now sent a clear message that it is safe to build and use systems that directly scan people’s personal phones for prohibited content.
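A quick technical aside on the "new algorithm" mentioned in that excerpt: Apple's matching system (which it calls NeuralHash) is a perceptual hash. Rather than exact file checksums, photos are reduced to compact fingerprints that stay stable under small edits like resizing or recompression, and two images "match" when their fingerprints are close enough. Here is a minimal sketch of that general idea in Python -- emphatically not Apple's actual algorithm -- where the Pillow dependency, the 8x8 hash size, the distance threshold, and the helper names are all illustrative assumptions:

# Minimal perceptual-hash sketch (average hash), for illustration only.
# This is NOT Apple's NeuralHash; it just shows why "slightly altered"
# copies of a known image can still match, and why false positives are
# a statistical question rather than an impossibility.
from PIL import Image  # assumes the Pillow library is installed

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to size x size grayscale, then set one bit per pixel
    depending on whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def matches_database(photo: str, known_hashes: set[int], threshold: int = 5) -> bool:
    """Flag the photo if its fingerprint is 'close enough' to any known hash.
    The threshold trades missed matches against false positives."""
    h = average_hash(photo)
    return any(hamming(h, k) <= threshold for k in known_hashes)

The policy fight sits on top of exactly this kind of knob-turning: a tighter threshold lets more altered copies slip through, while a looser one flags more innocent photos for human review.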

In a separate thread, Stamos has a suggested path forward for Apple, which involves pumping the brakes quite a bit on some of these features.

Meanwhile, Reuters revealed on Thursday that inside Apple there are widespread concerns as well.

Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread.

Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate is surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy.

The article notes that many of the concerns are coming from outside of the security team at Apple -- suggesting that the concerns are more about perception than they are technical. But, really, this highlights the same problem that Stamos noted earlier: Apple's standard operating procedure of doing everything alone, and then also doing "surprise" announcements regarding products. That's great for a new gadget in your pocket. It's not so great for dealing with a massively challenging and very legitimate problem with no easy answers, where getting things even a little wrong can have significant negative consequences.

Unlike many companies that rush out offerings that do more harm than good, Apple, I do believe, thought this through internally with lots of smart and thoughtful people. But these are problems and challenges that go beyond just one company -- and Apple's famously insular approach is exactly the wrong thing for this sort of challenge.


Filed Under: backdoors, csam, encryption, icloud, iphones, privacy, scanning, security, surveillance
Companies: apple


Reader Comments



  • Michael, 13 Aug 2021 @ 12:54pm

    Of course there's an easy answer

    there are no easy answers here

    Easy answer: No one should be scanning anyone's device ever, with the exception of law enforcement armed with a warrant.

    I have no issue with Apple scanning your iCloud -- every cloud storage company's been doing this for years because it's their hardware.

    Apple scanning your phone is no different than Mazda searching my car. Fuck that.

    • Greg Glockner (profile), 13 Aug 2021 @ 1:18pm

      Re: Of course there's an easy answer

      On-device scanning isn’t the problem. It’s when Apple quietly uses the data to report you to the police.

    • Anonymous Coward, 13 Aug 2021 @ 1:20pm

      Re: Of course there's an easy answer

      Apple scanning your phone is no different than Mazda searching my car. Fuck that.

      Well, that's coming. It's already impossible to order some cars without cellular-based tracking features (OnStar etc.), and now we've got in-car cameras and microphones appearing for various purposes—e.g., cameras watching the driver's face to tell if they're not watching the road while using driver assistance features. It's only going to get worse, and there's not much you can do other than stock up on old cars.

    • Anonymous Coward, 13 Aug 2021 @ 1:36pm

      Re: Of course there's an easy answer

      The problem is that people are falling for the idea that this is a "tech policy issue". Like, forget about child abuse: that doesn't involve technology, so we can't tech our way out of that. But pictures of child abuse (or non-abusive sexual pictures of children) do involve technology, and they make people uncomfortable, so let's just pretend that's the problem. Nevermind that we have no real evidence they're a significant driver of child abuse (and, in fact, there's some evidence that sexual pictures of adults reduce sexual abuse of adults—so maybe we're actually harming children by alleviating the discomfort of adults).

      You know, we have technology like mass surveillance, centralized storage, machine learning, and Apple can press a button and apply it to all our private shit. They have to do something, right? Doesn't do anything about people abusing children without taking photos. But, hmm, maybe those people have iPhones in their pocket or on the table while abusing the children, and if Apple could only activate the cameras and microphones—well, activate them differently, because Siri's already got the microphones always listening—maybe we'd help a few more children... Don't worry, though—we'd never use it for anything other than child sexual abuse material. And maybe non-sexual child abuse. And murder, it goes without saying. Maybe trade secrets, but only Apple's. But never for copyright infringement, 'cause we respect your privacy after all.

  • Anonymous Coward, 13 Aug 2021 @ 1:02pm

    I wonder how long we have to wait till we have something like teens being SWATed for sexting.

  • Greg Glockner (profile), 13 Aug 2021 @ 1:36pm

    A dangerous backdoor

    CSAM deserves to be prosecuted to the full extent of the law. Scanning for CSAM on the cloud is fair game. However, on-device scanning can and will lead to scope creep.

    What happens when China compels Apple to report pro-democratic content? When Mideast monarchs compel Apple to report homosexual content? When Germany compels Apple to report pro-Nazi content that traps research about fascism?

    Apple’s response is that we should all trust them to do the right thing. Do I trust Apple? Yes, I’m a loyal customer. Do I trust the US government? Generally. But do I trust all nations to treat their citizens fairly? Absolutely not.

    This feature is a despot’s dream, especially thanks to Apple’s infamous secrecy.

    • Anonymous Coward, 13 Aug 2021 @ 2:03pm

      Re: A dangerous backdoor

      What happens when..

      The FBI compels Apple to gather all encryption keys in use on a device? That it is theoretically possible for the OS vendor to compromise an OS is not disputed, but actually doing so is letting a whole herd of camels into the tent.

      • Greg Glockner (profile), 13 Aug 2021 @ 4:23pm

        Re: Re: A dangerous backdoor

        I can’t see that happening. Remember, Apple said they don’t view images. However, Apple is alerted when a file hash (a fingerprint) on the device matches the database. Currently, Apple said the database only includes CSAM images. I believe them and generally trust them in the USA. However, once the technology is available, then, for instance, the Chinese government can force Apple to add pro-democracy image hashes to the database. To force Apple, the Chinese government gives a simple ultimatum: “you can’t manufacture or sell your products in China unless you comply”.
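The structural point in the comment above is worth making concrete: an on-device matcher has no idea what its hash database represents -- it just compares fingerprints, so whoever controls the database controls what gets flagged. A toy sketch in Python (hypothetical names and values, not Apple's implementation):

# Toy sketch of database-driven matching, not Apple's implementation.
# The matcher below has no notion of "CSAM": it flags whatever fingerprints
# it is shipped, which is why control over the database is the whole game.
from dataclasses import dataclass

@dataclass(frozen=True)
class MatchReport:
    photo_id: str
    matched_hash: int

def scan_library(photo_hashes: dict[str, int], database: set[int]) -> list[MatchReport]:
    """Compare every photo fingerprint against an opaque database of
    'prohibited' fingerprints and report the matches."""
    return [MatchReport(pid, h) for pid, h in photo_hashes.items() if h in database]

# The same scanner, two different databases (all values made up):
csam_database = {0xA1B2, 0xC3D4}            # what Apple says it will ship
hypothetical_other_db = {0xA1B2, 0x9999}    # what a government might demand

library = {"IMG_0001": 0x9999, "IMG_0002": 0x1234}
print(scan_library(library, csam_database))          # -> [] : nothing flagged
print(scan_library(library, hypothetical_other_db))  # -> IMG_0001 gets flagged

Nothing in the matching step distinguishes a CSAM fingerprint from a pro-democracy-meme fingerprint; the only safeguards are policy decisions about what goes into the database and who reviews the matches.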

        • Greg Glockner (profile), 13 Aug 2021 @ 4:32pm

          Re: Re: Re: A dangerous backdoor

          I should add: I believe Apple genuinely tries to defend individual privacy, and I expect they fight any requests for device decryption. Additionally, if Apple maintains no decryption keys, then they cannot fulfill such a request.

        • Anonymous Coward, 13 Aug 2021 @ 4:36pm

          Re: Re: Re: A dangerous backdoor

          The problem is that Apple is about to let the nose of a camel into the tent, and once they do, they will have difficulty stopping it and the rest of the herd from following it in. If Apple implements this, it will have given law enforcement a very big lever for demanding other remote access to Apple systems.

    • JoeCool (profile), 13 Aug 2021 @ 5:41pm

      Re: A dangerous backdoor

      What happens when China compels Apple to report pro-democratic content? When Mideast monarchs compel Apple to report homosexual content? When Germany compels Apple to report pro-Nazi content that traps research about fascism?

      Everyone switches to Android, that's what. :)

  • Anonymous Coward, 13 Aug 2021 @ 2:33pm

    Antivirus

    Based on the underlying concept behind many antivirus products -- using AI and other filters to detect a defined set of virus signatures -- this doesn't seem that different. All that's changed is that the thing being defined as a virus artefact is CSAM.

    What seems to be the fundamental issue is that a human behavior is being classed as a virus. Yes, I agree that those who enjoy CSAM need to be found and punished, but backdooring this seems like a massive problem.

    What if whistleblowers who leaked incriminating documents had those purloined documents submitted to this "virus" database? And that'll most likely precede going after those who sent sexual material to someone of their own gender, because, VIRUS!

    As we've seen before, where Apple leads, Microsoft follows -- how long before such features get built into Defender (or whatever it's called now)? Assuming it can be disabled, won't that just lead to people removing basic security from their computers? And those who only slightly understand the issue will just buy other AV software, which will likely have similar functions and signatures.

    This won't end the way Apple has planned.

    • Anonymous Coward, 13 Aug 2021 @ 2:54pm

      Re: Antivirus

      This goes beyond AV in one critical way: AV tells the computer owner about the problem, while this tells the OS vendor about the problem. The first does not raise a privacy issue (although Microsoft telemetry does), while the Apple proposal has the potential to eliminate all privacy.
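That reporting-destination difference is the crux of the comparison above, and it comes down to a single design decision. A deliberately oversimplified sketch (purely hypothetical, not any vendor's actual code):

# Oversimplified contrast between the two reporting models discussed above.
# Purely hypothetical sketch; neither function reflects any vendor's real code.

def report_to_owner(detections: list[str]) -> None:
    """Classic antivirus model: findings stay on the machine, shown to the owner."""
    for item in detections:
        print(f"[local alert] suspicious file: {item}")

def report_to_vendor(detections: list[str]) -> list[dict]:
    """On-device scanning model: findings are packaged up to leave the device,
    bound for the vendor's review queue (and potentially onward from there)."""
    return [{"file": item, "destination": "vendor-review-queue"} for item in detections]

Everything else about the two models can be identical; the privacy question hinges on which of these two functions gets called.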

      • Greg Glockner (profile), 13 Aug 2021 @ 4:27pm

        Re: Re: Antivirus

        If antivirus software detects something, it alerts the computer owner. If Apple’s on-device CSAM scanner detects something, it alerts the police, who send a SWAT team to your door.

        These aren’t equivalent.

  • Anonymous Coward, 13 Aug 2021 @ 2:34pm

    I wouldn't be surprised to learn that Apple has been compelled to scan photos by a powerful government that controls access to their markets and this is how they will sneak it in.

  • Anonymous Coward, 13 Aug 2021 @ 4:55pm

    Nothing to see here

    Apparently, we're all just confused. Like when some people were holding their iPhone "wrong", it turns out it's our fault and not Apple's. We're just having trouble understanding.

  • Uriel-238 (profile), 13 Aug 2021 @ 7:04pm

    Wow. It sucks to be an Apple user.

    Not that Google users or Windows users are much safer. In fact, Microsoft has reserved the right to search Win10 systems and keylog Win10 sessions for all sorts of allegedly benign purposes, asserting that it keeps all rights to use whatever it finds however it likes, whether that's selling it to rival companies or reporting crimes to law enforcement. I really wonder how any business uses Win10 without (regularly) stripping all the spyware out. I only know businesses that do, and who shrug about it.

    But so far we haven't seen any company go without controversy when it comes to either a) spying on more than what they promised or b) spreading that data around more than they promised, usually selling it to third parties. And we haven't really seen any company that has suffered dire consequences for having done so, such as getting liquidated and all the executive officers getting shunned from society.

    The US and EU don't care much about broken corporate promises.

    And projects like this always creep without notice.

    The only way to do something like this is with full, 100% transparency, so that Apple reports to the end user everything it scanned and what it found of interest, including if it's sus. And the user gets these reports before the police do.

    If they're not willing to do that, it's high time to jailbreak your phone and encrypt the shit out of it, or use all your extra data space for hello.jpg-type images that would squick any human brain behind any human eyes. That, or choose a new phone OS.

    Another option would be to fill any unused iCloud space with sexy sand dunes, extreme baby close-ups, or whatever else might serve to provoke AI false positives. There are Def Con hackers far more creative than I am when it comes to adversarial input data to foul up image detection engines.

    Police in the United States are more interested in securing a white ethnostate than actually enforcing law, and alleged crimes are just a tool to them to that end. All this is going to do is provide more reasons for the DEA / FBI / ATF / local police to bust into nonwhite households and murder people where they live.

    In China or whatever country is rolling this out, I can only assume the motivations of their law enforcement are even worse.

  • VTEX (profile), 13 Aug 2021 @ 11:50pm

    You would think it would be time to change the root password to the U.S. Constitution after all the damage that has been done - but damn do we have bad sysops.

  • Paul, 14 Aug 2021 @ 3:00am

    If APPLE opens up this Pandora's Box just one tiny bit

    Then a flood of other snoops will follow, one salami slice at a time. APPLE had better not jump on this skewered bandwagon, or, you know, there is always UBUNTU and other common OSs for general computing and web browsing.

  • Anonymous Coward, 14 Aug 2021 @ 8:16am

    The easy answer is for nosy companies to keep their data-grubbing fingers out of people's lives and property and not "take responsibility" for things that are none of their business.

    "Think of the children" is a bad excuse for worse behaviour.

  • Anonymous Coward, 15 Aug 2021 @ 3:39pm

    Apple's internal plan runs thusly:

    We promise publicly to block GOVERNMENT requests for data.

    We will "sell" data to an NGO we set up. This NGO can sell the data to the government.

    Thus publicly we can say we ignore government requests, whilst funneling every document and image directly to the government's servers.

    We will start with "think of the children" for child porn, and slowly add to it. First CSAM, then all images. Then text-only documents, then iMessages, then SMS, and finally every document will be "scanned" and reported on.

    The internal plans are SO damning that governments across the EU, UK, etc. are planning to abandon iPhones for work use entirely, regardless of cost, as the security implications are just too high.

    Apple will be selling passwords to government authentication apps and websites stored on an iPhone to whichever government will pay for it.

  • Tanner Andrews (profile), 16 Aug 2021 @ 5:02am

    job applications

    safeguards, including having Apple employees review images before forwarding them

    There should be an interesting collection of job applications for these review positions.

  • Anonymous Coward, 18 Aug 2021 @ 7:57pm

    Sarah has already pointed out this system works really poorly on legal drawings. Drawings make up the majority of Child Sexual Abuse Material (CSAM) in the world, particularly with things like anime abuse imagery. Some countries like Japan openly produce it, in spite of international condemnation.

    This is a sub-category of NPAI (Non-Photographic Abuse Imagery), an official and widespread term, which is itself a sub-category of CSAM. Remember that this is a country-by-country list, so laws can vary. It is not required for it to be a real person, and it is no less abusive for failing to be a real person, as it is a representation of abuse, even if a fictional one.

    Unfortunately, it's very likely it could mistake a regular anime girl for an anime girl engaging in sexual activity, through factors like comparing the background.

    • Anonymous Coward, 21 Aug 2021 @ 9:56am

      Re:

      Everyone knows that all fictional characters are born older than the age of consent.

      /s


