Apple Undermines Its Famous Security 'For The Children'

from the this-is-dangerously-dumb dept

Apple is famous for its approach to security on its iPhones. Most notably, Apple went to court to fight the FBI's demand that it effectively insert a backdoor into its on-phone encryption (by being forced to push a custom update to the phone). Apple earned a great deal of goodwill in the security community (and with the public) because of that stand, though not in the law enforcement community. Unfortunately, it appears that Apple is now throwing away much of that goodwill and has decided to undermine the security of its phones... "for the children" (of course).

This week, Apple announced what it refers to as "expanded protections for children." The company has been under pressure from certain corners (including law enforcement groups who hate encryption) claiming that its encryption helps hide child sexual abuse material (CSAM) on phones (and in iCloud accounts). So Apple's plan is to introduce what's generally called "client-side scanning" to search for CSAM on phones, along with a system that scans iCloud photos for potentially problematic material. Apple claims that it's doing this in a manner that is protective of privacy. And, to be fair, this clearly isn't something that Apple rolled out willy-nilly without considering the trade-offs. It's clear from Apple's detailed explanations of the new "safety" features that it is trying to balance the competing interests at play here. And, obviously, stopping the abuse of children is an important goal.

The problem is that, even with all of the balancing Apple has done here, it has definitely stepped onto a very dangerous and very slippery slope toward using this approach for other things.

Apple's brief description of its new offerings is as follows:

Apple is introducing new child safety features in three areas, developed in collaboration with child safety experts. First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.

Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

Finally, updates to Siri and Search provide parents and children expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.

Some of the initial concerns about these descriptions -- including fears that, say, LGBTQ+ children might be outed to their parents -- have been somewhat (though not entirely) alleviated with the more detailed explanation. But that doesn't mean there aren't still very serious concerns about how this plays out in practice and what this means for Apple's security.

First, there's the issue of client-side scanning. As an EFF post from 2019 explains, client-side scanning breaks end-to-end encryption. The EFF's latest post about Apple's announcement includes a quick description of how this introduces a backdoor:

We’ve said it before, and we’ll say it again now: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change. Take the example of India, where recently passed rules include dangerous requirements for platforms to identify the origins of messages and pre-screen content. New laws in Ethiopia requiring content takedowns of “misinformation” in 24 hours may apply to messaging services. And many other countries—often those with authoritarian governments—have passed similar laws. Apple’s changes would enable such screening, takedown, and reporting in its end-to-end messaging. The abuse cases are easy to imagine: governments that outlaw homosexuality might require the classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand the classifier be able to spot popular satirical images or protest flyers.

We’ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of “terrorist” content that companies can contribute to and access for the purpose of banning such content. The database, managed by the Global Internet Forum to Counter Terrorism (GIFCT), is troublingly without external oversight, despite calls from civil society. While it’s therefore impossible to know whether the database has overreached, we do know that platforms regularly flag critical content as “terrorism,” including documentation of violence and repression, counterspeech, art, and satire.
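To make the EFF's point concrete, here's a minimal sketch in Python, with invented names (none of this is Apple's actual code or design, and Apple's real system reportedly uses perceptual hashing of images rather than the exact byte-hash used here), of what a client-side scanner looks like. Note that the scanner runs on the plaintext before encryption ever happens, and that its entire scope is defined by a hash database and a couple of configuration flags supplied from outside:

```python
import hashlib

# Everything below is hypothetical and for illustration only.
# A real system would use a perceptual hash of images, not SHA-256 of raw bytes.
BLOCKLIST = {
    # hashes of "prohibited" content, supplied by whoever maintains the database
    hashlib.sha256(b"known bad image bytes").hexdigest(),
}
SCAN_ALL_ACCOUNTS = False                  # flip this and the scope is no longer "children only"
REPORT_ENDPOINT = "reports.example.com"    # whoever controls this learns about matches


def scan_before_encrypt(plaintext: bytes, account_is_child: bool) -> bool:
    """Return True if the plaintext matches the blocklist and should be reported."""
    if not (account_is_child or SCAN_ALL_ACCOUNTS):
        return False
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST


def send_message(plaintext: bytes, account_is_child: bool) -> None:
    # The scanner necessarily sees the plaintext the encryption was meant to protect.
    if scan_before_encrypt(plaintext, account_is_child):
        print(f"would report a match to {REPORT_ENDPOINT}")  # out-of-band channel
    ciphertext = encrypt_end_to_end(plaintext)  # strong crypto, applied too late to matter
    deliver(ciphertext)


def encrypt_end_to_end(plaintext: bytes) -> bytes:
    return plaintext[::-1]  # stand-in for real encryption in this sketch


def deliver(ciphertext: bytes) -> None:
    print(f"delivering {len(ciphertext)} encrypted bytes")


if __name__ == "__main__":
    send_message(b"an ordinary photo", account_is_child=True)
```

Widening the scope is a one-line change to the database or the flags; nothing in the cryptography prevents it. That is the EFF's point: the hard part of the surveillance system is already built, and only policy decides what it looks for.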

It's actually difficult to find any security experts who support Apple's approach here; Alec Muffett summed it up in a single tweet.

This is the very slippery slope. If you somehow believe that governments won't demand that Apple cave on a wide variety of other types of content, you haven't been paying attention. Of course, Apple can claim that it will stand strong against such demands, but now we're back to being entirely dependent on trusting Apple.

As noted above, there were some initial concerns about the parent notifications. As EFF's description notes, however, the rollout does include some level of consent by users before their parents are notified, but it's still quite problematic:

In these new processes, if an account held by a child under 13 wishes to send an image that the on-device machine learning classifier determines is a sexually explicit image, a notification will pop up, telling the under-13 child that their parent will be notified of this content. If the under-13 child still chooses to send the content, they have to accept that the “parent” will be notified, and the image will be irrevocably saved to the parental controls section of their phone for the parent to view later. For users between the ages of 13 and 17, a similar warning notification will pop up, though without the parental notification.

Similarly, if the under-13 child receives an image that iMessage deems to be “sexually explicit”, before being allowed to view the photo, a notification will pop up that tells the under-13 child that their parent will be notified that they are receiving a sexually explicit image. Again, if the under-13 user accepts the image, the parent is notified and the image is saved to the phone. Users between 13 and 17 years old will similarly receive a warning notification, but a notification about this action will not be sent to their parent’s device.

This means that if—for instance—a minor using an iPhone without these features turned on sends a photo to another minor who does have the features enabled, they do not receive a notification that iMessage considers their image to be “explicit” or that the recipient’s parent will be notified. The recipient’s parents will be informed of the content without the sender consenting to their involvement. Additionally, once sent or received, the “sexually explicit image” cannot be deleted from the under-13 user’s device.

Whether sending or receiving such content, the under-13 user has the option to decline without the parent being notified. Nevertheless, these notifications give the sense that Apple is watching over the user’s shoulder—and in the case of under-13s, that’s essentially what Apple has given parents the ability to do.

It is also important to note that Apple has chosen to use the notoriously difficult-to-audit technology of machine learning classifiers to determine what constitutes a sexually explicit image. We know from years of documentation and research that machine-learning technologies, used without human oversight, have a habit of wrongfully classifying content, including supposedly “sexually explicit” content. When blogging platform Tumblr instituted a filter for sexual content in 2018, it famously caught all sorts of other imagery in the net, including pictures of Pomeranian puppies, selfies of fully-clothed individuals, and more. Facebook’s attempts to police nudity have resulted in the removal of pictures of famous statues such as Copenhagen’s Little Mermaid. These filters have a history of chilling expression, and there’s plenty of reason to believe that Apple’s will do the same.
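For readers who want the flow above in one place, here is a rough reconstruction of the decision logic the EFF describes, as an illustrative Python sketch. The names are invented, the real feature lives inside iMessage and is not exposed as an API, and everything hinges on what the machine-learning classifier decides is "explicit":

```python
from dataclasses import dataclass


@dataclass
class Account:
    age: int
    child_safety_enabled: bool   # the feature applies to child accounts in a family plan


def handle_flagged_image(account: Account, classifier_says_explicit: bool,
                         user_accepts: bool) -> dict:
    """Illustrative reconstruction of the notification rules described above."""
    outcome = {"warn_user": False, "notify_parent": False, "save_for_parent": False}
    if not (account.child_safety_enabled and classifier_says_explicit):
        return outcome
    if account.age < 18:
        outcome["warn_user"] = True        # 13-17: warning only
    if account.age < 13 and user_accepts:
        outcome["notify_parent"] = True    # under-13: the parent is told...
        outcome["save_for_parent"] = True  # ...and the image is kept for them to view
    return outcome


print(handle_flagged_image(Account(age=12, child_safety_enabled=True),
                           classifier_says_explicit=True, user_accepts=True))
```

Note how much rides on the single boolean coming out of the classifier, which is exactly the false-positive worry discussed next.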

There remains a real risk of false positives in this kind of system. There's a blog post, very much worth reading, that explains how automated matching technologies fail, often in catastrophic ways. You really need to read that entire post, as brief excerpts wouldn't do it justice -- but as it notes, the risk of false positives here is very high, and the cost of such false positives can be catastrophic. Obviously, CSAM is also catastrophic, so you can see the real challenge in balancing those interests, but there are legitimate concerns about whether that balance is properly calibrated in this approach.
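To see why the false-positive risk matters at this scale, here's a back-of-the-envelope calculation. Every number in it is made up purely for illustration (Apple has not published match rates or thresholds); the point is the base-rate arithmetic, not the specific figures:

```python
# Hypothetical numbers, chosen only to illustrate the base-rate problem.
photos_scanned_per_day = 1_000_000_000   # assumed daily volume across all users
false_positive_rate = 1e-6               # assumed one-in-a-million error per photo
true_matches_per_day = 100               # assumed genuine hits per day

false_flags = photos_scanned_per_day * false_positive_rate          # 1,000 per day
wrong_share = false_flags / (false_flags + true_matches_per_day)    # ~0.91

print(f"{false_flags:.0f} innocent photos flagged per day")
print(f"{wrong_share:.0%} of all flags are false positives")
```

With these (made-up) inputs, roughly nine out of ten flags are wrong even though the per-photo error rate sounds negligible, which is why the review, threshold, and appeal mechanisms matter at least as much as the matching technology itself.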

Obviously, Apple is trying to walk a fine line here. No one wants to be supporting CSAM distribution. But, once again, all the pressure on Apple feels like people blaming the tool for (serious) abuses by its users, and in demanding a "solution," opening up a very dangerous situation. If there were some way to guarantee that these technologies wouldn't be abused and wouldn't mess up, you could kind of see how this makes sense. But history has shown time and time again that neither guarantee is really possible. Opening up this hole in Apple's famous security means more demands are coming.


Filed Under: client side scanning, csam, encryption, for the children, parent notification, privacy, security
Companies: apple


Reader Comments



  • Samuel Abram (profile), 6 Aug 2021 @ 12:03pm

    There it is…

    Whoever said "CSAM is the rootkit to the Constitution" hit the nail on the head, and here it is in real time.

    • Anonymous Coward, 6 Aug 2021 @ 12:23pm

      Re: There it is…

      Definitely, as scans can be used to search for anything: terrorist materials, insulting Gollum, anything to be used against political activists.

    • Anonymous Coward, 6 Aug 2021 @ 3:22pm

      Re: There it is…

      That’s a problem of our own choosing. It’s completely possible for governments to take a more nuanced approach that targets some aspects of CSAM, such as origination or commercial distribution, as a crime while not going after individual possession with criminal charges or invasive technical measures. Instead, we’ve taken a problem which is insoluble by its very nature (“Stop all creation and sharing of abuse images by anyone, anywhere”) and used it to grant a blank check to law enforcement for very invasive measures that harm civil liberties and don’t stop CSAM.

      • Anonymous Coward, 6 Aug 2021 @ 9:58pm

        Re: Re: There it is…

        I don't know if I agree with your entire post, but I feel it might be good to move a bit away from surveillance or the police as the ultimate solution to every single problem.

        Is it possible to get computer graphics to the point where you can create child porn, without the children (artificial child porn)? I like the idea of getting someone to have sex with a robot (is it realistic enough to catch their attention?), rather than an actual child too.

        It might be possible to reduce all sorts of child abuse by reducing the number of covid lockdowns, as these have been strongly correlated with online child porn distribution (a 25% increase) and with offline child abuse. But this will require people to actually get vaccinated, which I can't believe is so hard to do in countries desperate to get out of this covid hell.

        We are literally in a position where anti-vaxxers are happy to come up with every excuse as to why people shouldn't get vaccinated, and these are usually the most "think of the children" sorts too. As it's an international problem, we'd need to ramp up vaccinations globally, and supplies may be thin in many countries.

        Anti-abuse campaigns could be more effective by explaining how molestation and rape are harmful to a child, how someone can fool themselves into thinking they're doing "something minor", and how to avoid that pitfall.

        They overwhelmingly focus on things being "immoral" or "illegal", and that mentality inevitably gets extended to deterring people from using any sort of artificial child porn (which might actually be counter-productive) by making it look like the state is watching them and is ready to arrest them. It is the harms to the child which make it bad, not simply that someone might find it offensive.

        For grooming on social media, we'd have to run online safety campaigns, and strengthen social supports, so that teens aren't so desperate for emotional supports online, where they might run into wily predators. Facebook is taking steps to make it harder for them to be exploited too.

        Many exploited children are LGBT+. Reducing the amount of anti-LGBT hostility could make them a lot less vulnerable to predators, although some portions of society are very happy to exploit them to pass totalitarian laws, but very unwilling to actually help them.

        Stopping child porn is an impossible problem, but we might be able to decrease it. Getting all crimes to zero should never be an absolute goal in a democratic society, as it is impossible to do so without turning this society into a totalitarian one, and that may still not be enough.

      • Manabi (profile), 8 Aug 2021 @ 1:06pm

        Re: Re: There it is…

        Most people don't realize this, but that's because the vast majority of the organizations pushing for stronger and stronger laws against "CSAM"¹ have as their root goal banning all pornography of all types. They use the "think of the children" angle to make it difficult for people to push back against the laws. Then once the laws are passed, and people start believing the arguments that merely viewing "CSAM"¹ will make people rape children, they can start arguing that regular pornography must be banned as well. Because clearly, if viewing "CSAM"¹ makes people rape children, viewing adult pornography will make people rape adults.

        It's insidious, because it's very hard to argue against without being labeled as pro-child-sexual-abuse. And they simply are not happy no matter how restrictive the laws get. They want 2D drawings banned as "CSAM"¹, even in cases where the characters are 18+, but have "child-like" features. They're very much like copyright extremists in that regard.

        They're anti-porn zealots, who don't actually care all that much about actual children. If they did, they'd realize some of the things they insist on cause quite a bit of damage to children. Just ask the teens and preteens who've been charged with production of "CSAM"¹ for sending nude photos of themselves to another preteen/teen. Or ask any of the children who get sexually abused by a family member, while most of society is focused on "stranger danger," because that's scarier and easier to use to get bad laws passed.

        ¹ The above is why I dislike the term CSAM. To the NCMEC simple nude pictures of children are CSAM. Nude photos are simply not automatically child sexual abuse, as any parent who's taken a photo of their toddler in the bath can tell you. When you read an article about someone being busted for having hundreds of photos of "CSAM" on their computer, most of those are probably nude photos, not actual sexual activity. It lets them make things sound far worse than they really are.

        • Samuel Abram (profile), 8 Aug 2021 @ 4:49pm

          Re: Re: Re: There it is…

          NCMEC?

          • Samuel Abram (profile), 8 Aug 2021 @ 4:50pm

            Re: Re: Re: Re: There it is…

            Oh, never mind, it means "National Center for Missing & Exploited Children".

        • Anonymous Coward, 8 Aug 2021 @ 5:35pm

          Re: Re: Re: There it is…

          NCMEC also parrots the idiot idea, darling of legislators everywhere, that end-to-end encryption should be compromised by law enforcement backdoors: https://www.missingkids.org/theissues/end-to-end-encryption

          Which I guess makes sense given their project here with Apple.

          The whole organization is sketchy as hell: it's privately operated but was created by Congress and receives government funding.

        • Scary Devil Monastery (profile), 9 Aug 2021 @ 1:47am

          Re: Re: Re: There it is…

          "...the vast majority of the organizations pushing for stronger and stronger laws against "CSAM"¹ have as their root goal banning all pornography of all types."

          ...because the most vocal such organizations are all religious in nature. Cynically put they use "For the children" as a wedge to make shit harder for teenagers having premarital sex.

          Quite a few such organizations once started as credible anti-abuse NGO's focused on stopping sex trafficking and sex tourism to countries where underage prostitution was common (Thailand, notably) but these organizations were rapidly infiltrated and taken over by the religious right who, bluntly put, doesn't give a single solitary fuck about the welfare of endangered children but certainly goes the distance to object against people of any age humping one another for any other purpose than procreation within the bounds of wedlock.

          Hence why some countries, surprisingly enough my own included, have an age limit for what counts as CSAM that is higher than the actual age of consent, making it fully legal for a 50-year-old and a 16-year-old to get it on, but god help the two 17-year-olds sexting each other.

          ...which neatly segues into how, in the most religious US states, child marriages are still legal and under no particular threat from the religious community so bent on banning sex, making it possible to marry the criminally young but not for two teens to be a couple.

  • anon, 6 Aug 2021 @ 12:31pm

    I don't think so.

    I bet they got an NSL and this was their most face-saving way to push a photo-id app to every iPhone on the planet.

  • Samuel Abram (profile), 6 Aug 2021 @ 12:31pm

    What about Signal?

    Does this mean that Signal is compromised? OMG, this is the worst news!

    • anon, 6 Aug 2021 @ 12:34pm

      Re: What about Signal?

      No, Signal isn't compromised. Apple's iOS is compromised. This is worse because you can't just download a different-but-equally-secure application to duplicate its functionality. Especially if there's a 'match everything' hash in that collection...

      • Manabi (profile), 8 Aug 2021 @ 1:12pm

        Re: Re: What about Signal?

        The worst part of this is the machine-learning scanning for sexual images, and that is limited to the iOS Messages app. Teens/preteens wanting to sext without Apple's nanny-state scanning can simply use another app. Seems pretty pointless to me, as that's exactly what teens/preteens will do.

        The CSAM-detection part scans against a database of known CSAM maintained by the NCMEC. It won't flag the photo a parent takes of their toddler playing in the bath, but the fact that people are assuming it will should worry Apple. People are assuming the worst, and this is going to be a PR nightmare for them because of it.

        • Samuel Abram (profile), 8 Aug 2021 @ 4:56pm

          Re: Re: Re: What about Signal?

          Teens/preteens wanting to sext without Apple's nanny-state scanning can simply use another app.

          Who controls the App store on iPhones?

    • Anonymous Coward, 6 Aug 2021 @ 12:45pm

      Re: What about Signal?

      No, Signal is not compromised, but the scanner could send the plaintext that you enter, or decrypt, straight to Apple, and that applies to any encryption system running on the phone.

      • Samuel Abram (profile), 6 Aug 2021 @ 12:47pm

        Re: Re: What about Signal?

        the scanner could send the plaintext that you enter, or decrypt straight to Apple, and that applies to any encryption system running on the phone.

        That doesn't really comfort me.

        • Anonymous Coward, 6 Aug 2021 @ 1:21pm

          Re: Re: Re: What about Signal?

          It means that Apple devices cannot be considered for secure communications.

          • Samuel Abram (profile), 6 Aug 2021 @ 1:24pm

            Re: Re: Re: Re: What about Signal?

            It means that Apple devices cannot be considered for secure communications.

            I'm an Apple User. I'm going to have to look elsewhere.

            • Anonymous Coward, 6 Aug 2021 @ 2:42pm

              Re: Re: Re: Re: Re: What about Signal?

              For real secure communications you need to go to an offline system for encryption and decryption, along with a safe way of copying encrypted files to and from the online machines. The latter is possible by using an Arduino as an SD card reader and writer. It's the usual problem: you can have security or you can have convenience.
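              To illustrate the commenter's air-gap idea, here is a minimal Python sketch using the third-party cryptography package: the key lives only on the offline machine, and the only thing that crosses the gap (on an SD card or otherwise) is ciphertext. The file names are made up, and Fernet is symmetric, so in practice the two parties would still need to share a key out of band:

```python
# Minimal sketch of the air-gap workflow: do the cryptography on a machine that
# never touches the network, and move only ciphertext across the gap.
# Requires the third-party "cryptography" package; file names are invented.
from cryptography.fernet import Fernet

# --- on the offline machine: the key is generated here and never leaves ---
key = Fernet.generate_key()
cipher = Fernet(key)

with open("draft_message.txt", "wb") as f:
    f.write(b"text written on the offline machine")

with open("draft_message.txt", "rb") as f:
    ciphertext = cipher.encrypt(f.read())

# --- only this file crosses the air gap to the online phone or PC ---
with open("to_sdcard.bin", "wb") as f:
    f.write(ciphertext)

# --- back on the offline machine, decrypting something carried in the same way ---
with open("to_sdcard.bin", "rb") as f:
    print(cipher.decrypt(f.read()))
```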

              • Samuel Abram (profile), 6 Aug 2021 @ 2:43pm

                Re: Re: Re: Re: Re: Re: What about Signal?

                Thanks.

                • Anonymous Coward, 6 Aug 2021 @ 3:04pm

                  Re: Re: Re: Re: Re: Re: Re: What about Signal?

                  In practice, and with some care in managing the system, Linux or BSD are secure enough, unless you are being targeted. If you are being targeted, you need to protect your offline system from a black bag job.

          • Anonymous Coward, 10 Aug 2021 @ 11:56am

            Re: Re: Re: Re: What about Signal?

            Why do they want to be the new RIM/BlackBerry so badly?

  • Anonymous Coward, 6 Aug 2021 @ 12:34pm

    I'm guessing next will be people reporting non-iPhone users to the authorities because they don't have an iPhone and clearly support CSA.

  • Anonymous Coward, 6 Aug 2021 @ 12:36pm

    No one wants to be supporting CSAM distribution.

    This is obviously false; if true, there would be no need to use technology to stop it.

    Obviously, CSAM is also catastrophic

    That's not obvious at all. Child abuse is certainly harmful, with sexual abuse being a subset of that, but we're talking about pictures of abuse (along with entirely non-abusive pictures, e.g. those taken and sent willingly). Not good, but "catastrophic" is hyperbole; is it so much worse than photos of other crimes, such as murders or beatings, which are perfectly legal to possess?

    The overreactions to these pictures are certainly harmful, including to children.

    • Anonymous Coward, 8 Aug 2021 @ 9:27am

      Re:

      Yep, and don't forget that drawings can be considered CSAM in this Orwellian society.

    • nasch (profile), 10 Aug 2021 @ 8:29am

      Re:

      is it so much worse than photos of other crimes, such as murders or beatings, which are perfectly legal to possess?

      If people are committing murders or beatings to sell the photos of them then no. If not, then that's the difference. I'm not saying the way we're tackling it is the right way, but the focus on the photos and videos is because in some cases the abuse is being done in order to produce the photos and videos.

  • That One Guy (profile), 6 Aug 2021 @ 12:48pm

    'Solving' a sliver by removing the arm

    The encryption issues are bad enough but one line stuck out to me as an indication that they really did not think this one through:

    In these new processes, if an account held by a child under 13 wishes to send an image that the on-device machine learning classifier determines is a sexually explicit image, a notification will pop up, telling the under-13 child that their parent will be notified of this content. If the under-13 child still chooses to send the content, they have to accept that the “parent” will be notified, and the image will be irrevocably saved to the parental controls section of their phone for the parent to view later.

    The software flags content of under-13 children as sexually explicit, and rather than block it they save it to the device. There are just so many ways that can and will go horribly wrong, from prosecutors bringing charges for knowing possession of CSAM (before anyone chimes in to say that would never happen, teenagers have been charged for creating/possessing CSAM of themselves) to someone else getting access to those photos, whether from a used phone that wasn't wiped or from simply loaning it to someone.

    If the goal was to reduce CSAM then this very much does not seem the proper way to go about it.

    • Koby (profile), 6 Aug 2021 @ 1:04pm [flagged by the community]

      Re: 'Solving' a sliver by removing the arm

      This might open an avenue to subversion. Get a burner phone, acquire some prohibited material, and send it to a judge or prosecutor. Someone else could then "leak" some rumor about the subject. Lo and behold, the proof is on their device. If it is demonstrated that their system is untrustworthy, then perhaps the manufacturer will decide to discontinue it.

      • Anonymous Coward, 6 Aug 2021 @ 11:38pm

        Re: Re: 'Solving' a sliver by removing the arm

        Get a burner phone, acquire some prohibited material, and send it to a judge or prosecutor. Someone else could then "leak" some rumor about the subject. Lo and behold, the proof is on their device.

        Why do people like you come up with such crazy ideas in order to fuck somebody over?

        It's like you are mad at life and spend all your waking hours trying to find new and creative ways to make somebody else's life just as miserable as your own.

        The sooner you fuck off, the better everybody else will be.

        • Anonymous Coward, 8 Aug 2021 @ 12:03pm

          Re: Re: Re: 'Solving' a sliver by removing the arm

          Not crazy at all. The number of investigations where this material has been conveniently found and used to tap into phones and communications is not small. Especially in second-world countries where police need additional proof, they fabricate it.

        • Scary Devil Monastery (profile), 9 Aug 2021 @ 2:48am

          Re: Re: Re: 'Solving' a sliver by removing the arm

          "Why do people like you come up with such crazy ideas in order to fuck somebody over."

          That One Guy is correct in his assessment which means Koby's just taking it from there.

          I don't usually spring to Koby's defense given his penchant for lying through his teeth concerning free speech issues, but in this case he's right. If it is possible to use existing law to drop wholly innocent people - children and parents alike - in some utterly horrifying legal nightmare then there will be ten thousand trolls to whom the idea of hurting other people precisely this way is a sexual fetish in itself.

      • That One Guy (profile), 7 Aug 2021 @ 12:47am

        Re: Re: 'Solving' a sliver by removing the arm

        Yes, engaging in what I'm sure would be multiple felonies in a way that would garner a lot of government attention would definitely be the smart and proper way to point out why this move is a bad one, brilliant idea there.

        • Scary Devil Monastery (profile), 9 Aug 2021 @ 2:51am

          Re: Re: Re: 'Solving' a sliver by removing the arm

          Unfortunately he's probably correct. A number of the friends he keeps advocating for are among the people who would gladly and with gay abandon set out to drop innocent people into horrifying legal messes. Specifically targeting BLM and ACLU advocates, liberals, transgender activists and such.

        • Anonymous Coward, 9 Aug 2021 @ 11:41pm

          Re: Re: Re: 'Solving' a sliver by removing the arm

          Holy shit you guys, I think we found John Smith!

    • That Anonymous Coward (profile), 6 Aug 2021 @ 1:52pm

      Re: 'Solving' a sliver by removing the arm

      And here it was pitched in other media as them comparing things uploaded to iCloud to the list of known CP.
      Then they would be reviewed by a human.
      Does that mean that Apple secret viewers would then be in possession of CP?

      Oh look, a shitshow to get some good PR that will definitely not completely undermine our privacy & security and make uploading a Winnie the Pooh picture in China an arrestable offense... oh wait... it is.

      Sorry, I fscking refuse to use the longer & longer name.
      Google George Carlin's bit about shell shock.
      We keep giving things longer and longer names & it takes away some of the horror we should be feeling about horrific things.
      You wouldn't call rape "non-consensual coitus"; that makes it sound like something different & maybe not as bad, but we make things longer & more polite so some people's sensibilities aren't offended.
      Kids were exploited, photographed, raped; how the fsck does "CSAM" convey what an actual horror that is?

      "a used phone that wasn't wiped or simply loaning it to someone."
      Or taking it into the Apple store only to discover he sent himself all your pics & contact details then was hitting on you?

      • Manabi (profile), 8 Aug 2021 @ 1:54pm

        Re: Re: 'Solving' a sliver by removing the arm

        We know that Google & Facebook hire people to review reported images & videos already, and that those people have a high turnover rate and develop psychological problems because of the images they're required to view. (Not just child pornography but graphic and brutal torture, gore, etc.) Apple probably does as well for iCloud, but they'll have to hire a lot more if they go through with this.

        Also it should be pointed out that most of the people hired to do this are contractors, not direct employees, and don't get the nice benefits normal employees get. Often that includes not having health insurance through the job at all, as well as being paid close to minimum wage, so they can't even afford to get psychological help to deal with the things the job requires them to view daily. This will help no children but harm a lot of adults.

        I'm with you on not calling it CSAM, but child pornography is often a misnomer as well. The NCMEC database contains a LOT of photos that are simply nudes of children not engaging in sexual activity. That's not what people think of when they hear child pornography, so renaming it to "child sexual assault material" was really uncalled for. They're deliberately trying to make it sound worse than it often is. Whenever you hear someone busted for having CP on their computer has "hundreds" or "thousands" of images, probably less than 10% of those are what people actually think of as pornographic, much less "sexual assault material."

        And let's not forget they want 2D drawings banned as CP as well. Those don't involve any actual children, so no children are possibly harmed by them.

  • Anonymous Coward, 6 Aug 2021 @ 12:51pm

    Ambiguity

    Is a picture of a person who is nude from the belly up "sexually explicit"?

    Does it matter if the person is female or male? (or "other")

    If you chose "yes" and "yes" for classifying the picture as "CSAM", please describe your method for reaching your determination. Only methods with a 99% reliability need apply.

  • Anonymous Coward, 6 Aug 2021 @ 1:12pm

    So now when one minor sends an explicit pic to another minor Apple is going to make sure to save a permanent copy of that pic AND send it to parents to view?

    I see no issues there.

  • You Rip What You Sew, 6 Aug 2021 @ 1:44pm [flagged by the community]

    Private corporation w 1A Rights to not associate w child porn.

    Don't like it? Get another brand of gadget.

  • You Rip What You Sew, 6 Aug 2021 @ 1:44pm [flagged by the community]

    Private corporation w 1A Rights to not associate w child porn.

    Don't like it? Get another brand of gadget.

    • You Rip What You Sew, 6 Aug 2021 @ 1:45pm [flagged by the community]

      Re: Private corporation w 1A Rights to not associate w child por

      This also proves that you do NOT actually own what you bought: it's fully under Apple's control.

    • Anonymous Coward, 6 Aug 2021 @ 1:57pm

      Re: Private corporation w 1A Rights to not associate w child por

      If you can't see the difference between controlling what appears on a public noticeboard and rummaging through your house for things someone does not like, you have serious problems in understanding people's rights.

    • Anonymous Coward, 8 Aug 2021 @ 8:08pm

      Re: Private corporation w 1A Rights to not associate w child por

      Not so good, buddy. That one's a repost.

  • You Rip What You Sew, 6 Aug 2021 @ 1:45pm [flagged by the community]

    By the way, kids, this IS the dystopian 21st century.

    It's a hoot that you all mutter over this tiny increment of surveillance capitalism / Big Nanny State, having ignored my shrieking for years.

    • You Rip What You Sew, 6 Aug 2021 @ 1:46pm [flagged by the community]

      Re: By the way, kids, this IS the dystopian 21st century.

      After your token grumbling, you'll have it normalized again and go back to advocating corporatism that guarantees as GOOGLE said: "YOU HAVE NO PRIVACY".

      • Anonymous Coward, 6 Aug 2021 @ 2:19pm

        By the way, kids

        By calling everybody kids, it sounds like you are the one with the CSAM issues. You're a real sick fuck, aren't you?

    • You Rip What You Sew, 6 Aug 2021 @ 1:47pm [flagged by the community]

      Re: By the way, kids, this IS the dystopian 21st century.

      You kids deserve to get corporatism good and hard. I bet you'll be the first generation made into Soylent Rainbow, starting with the most useless kibitzer / biggest eater: Maz.

      • Anonymous Coward, 6 Aug 2021 @ 6:46pm

        Re: Re: By the way, kids, this IS the dystopian 21st century.

        Spambot says stupid shit, gets downvoted by the community.

  • Anonymous Coward, 6 Aug 2021 @ 1:50pm

    Apple, Google, Facebook and their ilk are clearly not the ones at fault here. It is time we face who our true enemies are: The Children. Techdirt, since time immemorial, has hinted at their ungodly powers to sway the will of the most powerful corporations and governments. We need to stop them.

    Personally, I have never seen one of these little fuckers so I have no idea how we can defeat them but we have to try.

    Because if we don't, then... The Children have already won.

  • Snodoubt (profile), 6 Aug 2021 @ 1:55pm

    New iCloud ad coming soon - “now with 100% more rainbow table

    Phew! I was worried they were going to undermine my phone’s security. It’s nice to know that they just need to hash every word and sentence combo in the world and then they can decrypt my e2e encrypted iMessages if the hashes match. Since they have been working on this for years, I’ll assume that’s already taken care of.

  • Anonymous Coward, 6 Aug 2021 @ 1:59pm

    This is a bullshit argument. This scanning technology already exists, whether or not it's being used. And if any government decides they want Apple to implement this for nefarious purposes, they can do that today, whether or not it's already in use. And in that case, Apple has a decision to make; note that they already threatened to pull out of the UK market due to a possible government edict. And no one (other than the usual army of Apple haters) could argue with a straight face that Apple itself would do this for anything other than a good cause.

    • nasch (profile), 10 Aug 2021 @ 8:46am

      Re:

      And no one (other than the usual army of Apple haters) could argue with a straight face that Apple itself would do this for anything other than a good cause.

      I haven't seen anyone arguing that. The problem is, who gets to decide which causes are good?

  • Anonymous Coward, 6 Aug 2021 @ 2:10pm

    Fucking children, always being used as the justification for this shit.

  • PerfectBlend (profile), 6 Aug 2021 @ 2:30pm

    Shoutout & slippery slope

    A few years ago, TechDirt was mentioned on @GOGPodcast, and I followed TechDirt. I was a bit surprised. Who is this "Mike dude" making sense all the time? So, I started listening to the podcast. I cannot remember a single episode I disagreed with. I've been a long-time EFF supporter. Not because I think the EU is so much better, but good stuff and bad stuff radiates to the EU (and vice versa). So, the nonsense has moved from the internet to my backyard: first the ridiculous copyright act and now Apple. We're all emphatic about fighting child abuse. Often forgotten: it usually comes from people known to the child. The unpleasant truth is, once you start to mess with security (and privacy in private spaces), the ship has sailed. My cynical tweet "It takes one gag order" (to change this thingy into a nightmare) isn't so far from the truth. We all know where this starts; nobody knows where it will end. Let's hope it doesn't end with the "legal but inappropriate" proposal our UK friends are trying to avoid.

  • scotts13 (profile), 6 Aug 2021 @ 2:56pm

    Last straw

    I've worked for and with Apple since 1983. My house is full of their stuff to this day. They've done some things I didn't agree with, but overall I overwhelmingly supported the company. This surveillance is 100% the opposite of everything I thought they stood for. It rolls out, and I'm done.

  • Anonymous Coward, 6 Aug 2021 @ 3:37pm

    Hasn't Apple, by definition, caved, just by doing this? We should trust Tim Cook, who funnels Apple's profits offshore to evade taxes? Who changes cable standards every few years to increase revenues, then pulls cables to save pennies? Cook would sell his mother if it reduced costs, which is why we're in this mess, because they went to China.

    And haven't they also caved to demands by China, because, well, they could pretty much shut down their entire business? All your data is stored on Chinese servers, if you are Chinese?

    I buy Apple products because of the "implied" higher standard of security. It may or may not actually exist, or not as fully as I'd like, but I trust them more than, say, Amazon with my information. Thing is, the competitors are almost all cheaper, and if this is just a distinction without a difference then, well, I might as well save some money.

    • That One Guy (profile), 6 Aug 2021 @ 4:53pm

      Re:

      'Sir, we seem to have dug ourselves a hole and given ourselves a PR black-eye in the process with this latest move, what should we do?'

      'Keep digging!'

    • Anonymous Coward, 7 Aug 2021 @ 8:55pm

      Re:

      This is honestly bordering on QAnon level shit.

      I also resent any argument set up to make opposition to it appear inherently morally repugnant. I have been called pro-criminal far too many times for my opinions on the government to find the least bit of good faith in that strategy. If that's all they can come up with they have nothing.

  • Anonymous Coward, 6 Aug 2021 @ 6:57pm [flagged by the community]

    The best part about techdirt articles is reading all the outraged, pearl-clutching comments.

    Anyways, doesn't Apple have like two billion iOS devices out there? How are they going to manage it all? Is this just for certain countries? I mean, in some places, like Afghanistan, they marry off girls as young as 11!

    I foresee a great big reversal of this in less than two years.

    The stupidity of this whole scheme just screams "invented at Apple," probably by its legal dept. Most of Apple's innovations have been other people's ideas.

    • Anonymous Coward, 6 Aug 2021 @ 7:04pm

      Re:

      Order of precedence of Apple divisions in terms of employee salary:

      Legal

      Marketing

      Chip Design

      Software

      Hardware

      (Apple makes some of the most unreliable hardware of all the laptops I've owned)

    • Anonymous Coward, 6 Aug 2021 @ 10:09pm

      Re:

      I don't think Afghanistan likes any porn. They don't really single out child porn.

  • Anonymous Coward, 7 Aug 2021 @ 1:08am

    This "feature" also fails to take into account that there are fuck ups everywhere, including parents. I don't think it will be long that we hear stories of parents... "misusing" this system.

    • Anonymous Coward, 7 Aug 2021 @ 9:01pm

      Re:

      Ya, an abusive caretaker with power over the phone is another terrible situation. Apple could inadvertently hand-deliver them explicit pics of their victim. A stepparent having control is even more likely to lead to this.

  • Anonymous Coward, 7 Aug 2021 @ 1:49am

    The trouble is that ads can bring illegal material onto your device without your knowing it, which is why blocking ads is a necessary evil.

    Since Apple does not allow ad blocking apps to be installed, that is a good argument to stay the hell away from Apple products.

  • Anonymous Coward, 7 Aug 2021 @ 2:02pm

    Isn't what they are doing a first step towards content moderation at the OS level? It was obvious that sooner or later they would find a way to make it accepted by the public. It's all about the narrative spin. Maybe an alternative will come out - I am thinking about a Raspberry Pi dongle or something similar that uses the phone just as a dongle.

  • Anonymous Coward, 7 Aug 2021 @ 2:36pm

    https://appleprivacyletter.com/ Help sign the open letter to oppose Apple's invasive actions!

  • Anonymous Coward, 7 Aug 2021 @ 3:48pm

    What about all the other contributors?

    So what are the various automobile companies doing to make sure that they're not somehow a part of distribution... obviously, some sort of transportation took place for that phone to make it to the end user... and they might drive somewhere to take those CSAM pictures.
    And how about the power companies? Are they monitoring what's happening with all that electricity? Certainly there'd be less CSAM if there were no electricity for it!
    And all grocery stores and restaurants allowing people to eat there without making sure there's no CSAM happening... otherwise, they're providing nourishment for this to continue.

    Oh the humanity!

  • Anonymous Coward, 8 Aug 2021 @ 8:24pm

    When yo dick so smol sexting your girlfriend triggers the filter.

  • n00bdragon (profile), 9 Aug 2021 @ 6:51am

    And, obviously, stopping the abuse of children is an important goal.

    At the risk of sounding like a child-molesting nogoodnik: how important is this, really? Has there ever been any attempt to quantify, in numerical terms, how many children are sexually abused via iCloud? From the way Apple's press release is worded you would think that iCloud was a digital cornucopia of child abuse photographs that America's legions of predators took for some reason. Saying something like "We just don't know how many abuse images there could be" is like saying there is just no way of telling how much terrorism is planned out via the English language. How many actual real pedophiles is this change going to reel in? Can anyone take even an order-of-magnitude guess?

    • Anonymous Coward, 9 Aug 2021 @ 11:48pm

      Re:

      Realistically? If testimonies of actual victims are anything to go by, the odds are not great. Most of the actual offenders live in supportive communities that willingly turn a blind eye, and are very well shielded by law enforcement for being high up the social food chain.
