Apple Recognizes It Jumped Too Quickly On Its CSAM Detection System; Delays Implementation

from the good dept

Sometimes speaking out works. A month ago, Apple announced a series of new offerings that it claimed would be useful in fighting back against CSAM (child sexual abuse material). This is a real problem, and it's commendable that Apple was exploring ways to fight it. However, the major concern was how Apple had decided to do this. Despite the fact that a ton of experts have been working on ways to deal with this extremely challenging problem, Apple (in Apple fashion) went it alone and jumped straight into the deep end, causing a lot more trouble than necessary -- both because its implementation carried numerous serious risks that Apple didn't seem to account for, and (perhaps more importantly) because the plan could wipe away years of goodwill built up in conversations between technologists, security professionals, human rights advocates, and others trying to find solutions that better balance the risks.

Thankfully, with much of the security community, the human rights community, and others calling attention to Apple's dangerous approach, the company has now announced a plan to delay the implementation, gather more information, and actually talk to experts before deciding how to move forward. Apple put (in tiny print...) an update on the page where it announced these features.

Update as of September 3, 2021: Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

It's good that the company has finally realized that it moved too quickly on this and hadn't necessarily understood the ramifications of its decision. It remains to be seen whether the company will actually do more in terms of realizing how dangerous its approach was, or if it will simply look to make a few cosmetic changes to the system.

Notably, this announcement came out just as Scientific American released an interesting article that warns that one of the child safety features -- one that had received less concern than the others -- might harm children more than it helps. This is the "communication safety in messages" feature that would scan iMessages of kids under 13, blur messages that the system deemed sexually explicit, and alert parents if the kid sends or opens such a message.

There were some initial concerns about this -- especially regarding LGBTQ children whose parents might not be understanding. However, many of those initial concerns were quieted by the details of the program -- including the fact that it was opt-in and was designed to make clear to both the kids and the parents what was happening (so no sneaky surveillance or surprise alerts). It's also specifically designed for child accounts that are set up in Family Sharing.
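To make that flow concrete, here is a rough sketch of the decision logic -- not Apple's actual code or API; the types and names (ChildAccount, ImageHandling, handle) are all invented for illustration:

    // A rough, hypothetical sketch of the flow described above -- this is NOT
    // Apple's implementation or API; every name here is invented for illustration.

    struct ChildAccount {
        let age: Int
        let inFamilySharing: Bool       // the feature is tied to child accounts in Family Sharing
        let safetyFeatureEnabled: Bool  // opt-in: a parent has to turn it on
    }

    enum ImageHandling {
        case deliverNormally
        case blurWarnChildAndNotifyParents
    }

    // Decide what happens to an image the child is sending or opening, given the
    // verdict of some on-device classifier that judged it sexually explicit or not.
    func handle(imageIsExplicit: Bool, for account: ChildAccount) -> ImageHandling {
        // Only explicit images on opted-in, under-13 child accounts in Family
        // Sharing trigger anything at all.
        guard account.inFamilySharing,
              account.safetyFeatureEnabled,
              account.age < 13,
              imageIsExplicit else {
            return .deliverNormally
        }
        // The image is blurred, the child sees a warning, and the parents are
        // alerted if the child sends or opens it anyway.
        return .blurWarnChildAndNotifyParents
    }

The opt-in and Family Sharing requirements live entirely in that guard clause; everything else hinges on what the on-device classifier decides.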

However, as the SciAm article notes, this system should still raise concerns, because it would make kids think they're always being watched:

In fact, even by having this feature, we are teaching young people that they do not have a right to privacy. Removing young people’s privacy and right to give consent is exactly the opposite of what UNICEF’s evidence-based guidelines for preventing online and offline child sexual exploitation and abuse suggest. Further, this feature not only risks causing harm, but it also opens the door for wider intrusions into our private conversations, including intrusions by government.

We need to do better when it comes to designing technology to keep the young safe online. This starts with involving the potential victims themselves in the design of safety systems. As a growing movement around design justice suggests, involving the people most impacted by a technology is an effective way to prevent harm and design more effective solutions. So far, youth haven’t been part of the conversations that technology companies or researchers are having. They need to be.

Again, there are real concerns here, and parents obviously want to protect their children. But over and over again we've seen that the way you do that is by teaching them how to handle dangerous situations, rather than by adding yet another layer of surveillance. The surveillance not only teaches them that they have no privacy, but also takes away their agency in learning how to deal with difficult situations themselves.

Filed Under: children, client side scanning, csam, scanning, security
Companies: apple


Reader Comments

  • Anonymous Coward, 3 Sep 2021 @ 3:24pm

    The problem with the Apple proposal, along with Microsoft's telemetry, is that they demonstrate code implemented to monitor and exfiltrate what users are doing live. How long before some government decides that phone-tap laws apply, and arrives at the companies with wiretap orders in hand?

  • Anonymous Coward, 3 Sep 2021 @ 4:30pm

    I'm guessing someone is already accusing Apple of assisting predators now. Apparently people still think Barbara Lee spit in the face of trafficking victims by voting no on FOSTA.

  • That Anonymous Coward (profile), 3 Sep 2021 @ 7:48pm

    Or in the alternative...
    Parents could, I dunno, parent.
    Take 5 minutes, have a conversation with them.
    Warn them what's out there, tell them they can always come to you for help, and all those other wacky parenting ideas that got abandoned because tech can do it for us.
    Gasp, have that birds & bees talk that so terrifies you... or would you prefer to be on Maury supporting your child claiming it had to be 1 of the 5 guys she hooked up with?

    There is an informational void out there, & do you really want it filled by answers on Google or from other teens who heard that one time that if you guzzle Coke then swallow a Mentos you can make sure you won't get pregnant?

    Y'all made the decision to have kids, and personally I am really fucking tired of having more hoops put in my path to jump through because you can't be bothered to supervise your kid, so you demand everyone else do it for you.
    If you are too scared to explain the birds and the bees, perhaps you aren't ready to have kids.
    If you can't spend 5 min a day checking in on your kids & actually listen to them, perhaps you aren't ready to have kids.
    If you won't explain what you expect & actually have consequences when they fail to meet the expectations, perhaps you aren't ready to have kids.

    All of the Teen Mom reality shows were supposed to be a warning, but instead you made them famous, so kids want to be like them. But it's society's fault your kid didn't know sex makes babies, even the very first time.

    • Anonymous Coward, 4 Sep 2021 @ 10:49am

      Re:

      Modern parents aren't that different from previous generations; just the medium has changed. It used to be plop them in front of the TV; now it's give them a phone or tablet, let the schools and society take care of the child, and then bitch about how bad a job teachers, TV channels, or TikTok are doing. And the capitalistic system isn't doing anybody a favour when you need to be constantly working just to pay the bills.

  • Will (profile), 4 Sep 2021 @ 1:08am

    This fight is already over

    It is no longer important what Apple chooses to do (or not do). Now that Apple has publicly stated its belief that it could successfully implement its original content detection system, any government with the appropriate legal power could compel Apple to implement that system (tuned to detect whatever content the government specifies). Technical infeasibility is the only argument governments cannot (easily) legislate around when it comes to facilitating surveillance; now Apple can’t argue that ‘secure’, ‘private’, ‘targeted’ on-device scanning can’t be done, because it has already built the tools to do it!

    • JohnB (profile), 4 Sep 2021 @ 3:59pm

      Re: This fight is already over

      Agreed. At the time of Apple's initial announcement, my thought was that what they were trying to do was normalize an endpoint-scanning requirement already placed on them by China. However, if that wasn't the case, I wondered how long it took after the announcement for Tim Cook to receive a phone call from the Chinese government explaining what they would be doing in the future if they wanted to sell/manufacture iPhones in China. Despite Apple's apparent climbdown, my belief is that if they are still doing business in China in a month, endpoint scanning will be live there. I have been a long-time Apple customer, but now I will never buy another Apple device, nor update my OS, until I retire my current devices.

  • Christenson, 4 Sep 2021 @ 9:58pm

    Apple, Please Actually help children

    Dear Apple:
    The experts' report tells us:
    1) The evidence for what works is weak, but
    2) the following seems to be the crux of what does work: "the critical need to make children (and parents) aware of the risks of exploitation and abuse and equip them with the knowledge, skills and tools to protect themselves and to seek help and report abuse when it happens"

    So if you really want to reduce the amount of abuse, or even bullying, you need to
    1) educate people to recognize abuse generally and point them to effective help. This must not be targeted in a way that implies abuse has actually been inferred about anyone, and the advanced material for those who want more must preserve privacy and not be retained on personal endpoint computers and phones where it may be viewed.
    2) ensure that the help and reporting systems both exist and are effective.
    3) Support non-politicized research into some extremely complicated questions:
    a) what actually determines whether or not people abuse others, besides simply being human beings in positions of power?
    b) what are the real effects of pornography in general?
    c) if someone has a particular, perhaps harmful, sexual fixation, and they experience erotica or porn on that or other sexual subjects, what is the effect of that experience? I note that the existence of deepfakes implies that CSAM can at least be produced without harming children, but, as with violent cartoons, there's no good evidence on the effects on viewers except for the harm visited on them by law enforcement.


    It's also not hard to work out that automatically inferring fraught things on anyone's phone is going to cause significant harm because:
    4) Any inference of abuse on a child's phone has a very good chance of causing harm to the abused child or even an abused adult. Lots of abusive parents and abusers in general read what's on the abusee's phone, and there are no universally trustworthy parties to send reports to, since abusers are frequently authority figures with their names on the abusee's accounts. Many police agencies and child welfare agencies have also demonstrated untrustworthiness through such acts as untested rape kits and removing dark-skinned children for reasons not related to abuse or neglect.

    5) More generally, anything fraught the phone may infer is unlikely to remain a secret and WILL cause harm to innocent users. Abusive governments will secretly force the phone to infer and report social outliers -- activists, protestors, people voicing political criticism, even gay people -- and direct harm at those people. Even if we agree that what is inferred will always be truly harmful and should be punished, errors are intrinsic to the inference and will cause false positives, because the phone lacks context -- someone trying to educate about, assist with, or investigate either real incidents or the inference itself may have verbatim copies of the exact same messages or pictures the bad guys do.

    For an example, let's trace a plausible history of a simple death threat...it originates as a hypothetical in an educational example on a good site, gets copied by a bad actor, transmitted to someone seriously, and that someone makes a decision...to send it to a friend, who then sends it on to the bad actor's employer, who posts the threat for the public to see, and then someone from Apple uses it to train a neural network. Legal consequences ensue, so it shows up in court filings, too. In each case, there's a verbatim copy of the same words, but in only two cases is it possibly reasonable for the computers involved, all possibly phones, to raise flags. Even then, suppose the two directly involved with the threat see each other in person and are, in fact, play acting (perhaps again for an educational demo).

  • Anonymous Coward, 7 Sep 2021 @ 7:31am

    hard-headed management

    That's never gone wrong before, right?
    -President Biden's hard line on the Afghanistan pullout
    -New Coke
    -Windows 11 hardware requirements
    -etc

