Apple's New Scanning Tools Raising More Concerns, Even Inside Apple
from the take-a-step-back dept
Last week, we wrote about our concerns with Apple's newly announced scanning efforts, which the company claims are meant to protect children. Lots of security experts raised concerns about how this was being rolled out -- and none of the complaints were meant to take away from the very real and legitimate concerns about child sexual abuse. Security guru Alex Stamos wrote one of the most thoughtful threads about the whole thing, noting (as with so many of these issues) that there are no easy answers here. I highly recommend you read the entire thread, but here's a little snippet:
Likewise, I am both happy to see Apple finally take some responsibility for the impacts of their massive communication platform, and frustrated with the way they went about it. They both moved the ball forward technically while hurting the overall effort to find policy balance.
— Alex Stamos (@alexstamos) August 7, 2021
As with so many nuanced tech policy debates with no easy answers, a big part of the problem here, as Stamos notes in his thread, is that there have been tons of ongoing conversations about the nuances and tradeoffs. Apple's approach is not as disastrous and dangerous as it could have been (clearly the team at Apple put a lot of thought into minimizing many, though not all, of the risks), but it was still rolled out without talking to the many, many people who have been trying to find a reasonable balance. And that messes a lot of stuff up.
Apple was invited but declined to participate in these discussions, and with this announcement they just busted into the balancing debate and pushed everybody into the furthest corners with no public consultation or debate. pic.twitter.com/a3OTxkzH43
— Alex Stamos (@alexstamos) August 7, 2021
Stamos and computer science professor/security researcher Matt Green have now published a good piece in the NY Times highlighting their concerns. The article notes there is less concern about the iMessage child safety features (Apple's initial description of those features seemed much more concerning, but the details show why they're not that bad). But the photo scanning on the phone raises a lot of concerns:
But the other technology, which allows Apple to scan the photos on your phone, is more alarming. While Apple has vowed to use this technology to search only for child sexual abuse material, and only if your photos are uploaded to iCloud Photos, nothing in principle prevents this sort of technology from being used for other purposes and without your consent. It is reasonable to wonder if law enforcement in the United States could compel Apple (or any other company that develops such capacities) to use this technology to detect other kinds of images or documents stored on people’s computers or phones.
While Apple is introducing the child sexual abuse detection feature only in the United States for now, it is not hard to imagine that foreign governments will be eager to use this sort of tool to monitor other aspects of their citizens’ lives — and might pressure Apple to comply. Apple does not have a good record of resisting such pressure in China, for example, having moved Chinese citizens’ data to Chinese government servers. Even some democracies criminalize broad categories of hate speech and blasphemy. Would Apple be able to resist the demands of legitimately elected governments to use this technology to help enforce those laws?
Another worry is that the new technology has not been sufficiently tested. The tool relies on a new algorithm designed to recognize known child sexual abuse images, even if they have been slightly altered. Apple says this algorithm is extremely unlikely to accidentally flag legitimate content, and it has added some safeguards, including having Apple employees review images before forwarding them to the National Center for Missing and Exploited Children. But Apple has allowed few if any independent computer scientists to test its algorithm.
The computer science and policymaking communities have spent years considering the kinds of problems raised by this sort of technology, trying to find a proper balance between public safety and individual privacy. The Apple plan upends all of that deliberation. Apple has more than one billion devices in the world, so its decisions affect the security plans of every government and every other technology company. Apple has now sent a clear message that it is safe to build and use systems that directly scan people’s personal phones for prohibited content.
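For readers wondering what "recognize known child sexual abuse images, even if they have been slightly altered" looks like mechanically, the general class of technique the op-ed is describing is perceptual-hash matching: compute a compact fingerprint of an image that survives re-encoding or small edits, then compare it against a list of fingerprints of known prohibited images. Here's a minimal sketch of that idea using a simple "average hash" and a Hamming-distance threshold. To be clear, this is not Apple's NeuralHash (which, as the op-ed notes, few if any independent researchers have been able to test); every function name and threshold below is purely illustrative.

```python
# Toy illustration of perceptual-hash matching against a list of known hashes.
# NOT Apple's NeuralHash -- just a generic "average hash" sketch showing how
# slightly altered copies of a known image can still match.

from PIL import Image  # Pillow


def average_hash(path: str) -> int:
    """Shrink to 8x8 grayscale; set one bit per pixel brighter than the mean."""
    img = Image.open(path).convert("L").resize((8, 8))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for i, p in enumerate(pixels):
        if p > mean:
            bits |= 1 << i
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")


def matches_known_hash(path: str, known_hashes: set, max_distance: int = 5) -> bool:
    """Flag an image whose hash lands within max_distance bits of any known hash."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= max_distance for k in known_hashes)
```

Note that nothing in this machinery knows or cares what the "known hashes" represent. That's precisely the worry Stamos and Green raise: swap in a different list of hashes, and the same system matches whatever content the list's supplier wants found.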
In a separate thread, Stamos has a suggested path forward for Apple, which involves pumping the brakes quite a bit on some of these features.
Meanwhile, Reuters revealed on Thursday that there are widespread concerns inside Apple as well.
Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread.
Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate is surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy.
The article notes that many of the concerns are coming from outside Apple's security team -- suggesting they are more about perception than technical substance. But, really, this highlights the same problem Stamos noted earlier: Apple's standard operating procedure of doing everything alone and then making "surprise" product announcements. That's great for a new gadget in your pocket. It's not so great for dealing with a massively challenging and very legitimate problem with no easy answers, where getting things even a little wrong can have significant negative consequences.
Unlike many companies that rush out offerings that do more harm than good, Apple does appear to have thought this through internally with lots of smart and thoughtful people. But these are problems and challenges that go beyond any one company -- and Apple's famously insular approach is exactly the wrong thing for this sort of challenge.
Filed Under: backdoors, csam, encryption, icloud, iphones, privacy, scanning, security, surveillance
Companies: apple