Apple is discontinuing its plan to scan your photos for CSAM. Here’s what comes next

In August 2021, Apple announced a plan to scan photos users had stored in iCloud for child sexual abuse material (CSAM). The tool was designed to be privacy-preserving, allowing the company to flag potentially problematic and abusive content without revealing anything else. But the initiative was controversial and quickly drew widespread criticism from privacy and security researchers and digital rights groups, who were concerned that the monitoring capability itself could be exploited to undermine the privacy and security of iCloud users around the world. In early September 2021, Apple said it would pause the feature’s rollout to “gather input and make improvements before releasing these critically important child safety features.” In other words, a launch was still coming. Now the company says that, in response to the feedback and guidance it received, the CSAM detection tool for iCloud Photos is dead.

Instead, Apple told WIRED this week, it is focusing its anti-CSAM efforts and investments on its “Communication Safety” features, which the company initially announced in August 2021 and launched last December. Parents and guardians can opt into the protections through family iCloud accounts. The features work across Siri, Apple’s Spotlight search, and Safari search to warn anyone who is looking at or searching for child sexual abuse material and to provide on-the-spot resources to report the content and seek help. At the core of the protections is Communication Safety for Messages, which caregivers can set up to show children a warning and resources if they receive or attempt to send photos that contain nudity. The aim is to stop child exploitation before it happens or becomes entrenched and to reduce the creation of new CSAM.

“Following extensive consultations with experts to collect feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021,” the company told WIRED in a statement. “We have further decided not to proceed with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies searching personal data, and we will continue to work with governments, children’s advocates and other companies to help protect young people, preserve their right to privacy and make the internet a safer place for children and for all of us.”

Apple’s CSAM update comes alongside its announcement today that the company is vastly expanding its end-to-end encryption offerings for iCloud, including adding protection for backups and photos stored on the cloud service. Child safety experts and technologists committed to combating CSAM have often opposed wider deployment of end-to-end encryption because it makes user data inaccessible to technology companies, making it more difficult for them to scan and flag CSAM. Law enforcement agencies around the world have similarly cited the dire problem of child sexual abuse in opposing the use and expansion of end-to-end encryption, though many of these agencies have historically been hostile to end-to-end encryption in general because it can make some investigations more challenging. Research, however, has consistently shown that end-to-end encryption is an essential security tool for the protection of human rights and that the drawbacks of implementing it do not outweigh the benefits.

Communication Safety for Messages is opt-in and analyzes image attachments that users send and receive on their devices to determine if a photo contains nudity. The feature is designed so that Apple never gets access to the messages, the end-to-end encryption that Messages provides is never broken, and Apple won’t even find out that a device has detected nudity.
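To make that design concrete, here is a minimal, hypothetical Swift sketch of the on-device flow described above. The `NudityClassifier` type and `decidePresentation` function are illustrative stand-ins, not real Apple APIs; the point is only that the check runs locally and its result never leaves the device.

```swift
import Foundation
import CoreGraphics

// Hypothetical stand-in for the on-device model; not a real Apple API.
struct NudityClassifier {
    /// Runs entirely on the device and returns true if the image likely contains nudity.
    func containsNudity(_ image: CGImage) -> Bool {
        // Placeholder for on-device ML inference.
        return false
    }
}

enum AttachmentDecision {
    case showNormally
    case blurWithWarning   // the child sees a warning and resources, and can still choose to view
}

/// Decides how to present an incoming image attachment.
/// Nothing here contacts a server: the message’s end-to-end encryption is untouched,
/// and no record of the detection ever leaves the device.
func decidePresentation(for attachment: CGImage,
                        communicationSafetyEnabled: Bool) -> AttachmentDecision {
    guard communicationSafetyEnabled else { return .showNormally }  // the feature is opt-in
    return NudityClassifier().containsNudity(attachment) ? .blurWithWarning : .showNormally
}
```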

The company told WIRED that while it isn’t yet ready to announce a specific timeline for expanding its Communication Safety features, it is working to add the ability to detect nudity in videos sent through Messages when the protection is enabled. The company also plans to expand the offering beyond Messages to its other communication applications. Ultimately, the goal is to enable third-party developers to incorporate the Communication Safety tools into their own applications; a speculative sketch of what that could look like follows below. The more widely the features can spread, Apple says, the more likely it is that children will get the information and support they need before they are exploited.
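If Apple does open the tools to other apps, a third-party integration might look something like the following sketch. The protocol and types here are purely speculative placeholders, not announced APIs; they only illustrate the kind of hook a messaging app could adopt while keeping the analysis on-device.

```swift
import Foundation
import CoreGraphics

/// Speculative placeholder for a system-provided analyzer that a third-party
/// messaging app could call before rendering an incoming attachment.
/// Not a real Apple API.
protocol CommunicationSafetyAnalyzing {
    /// Performs the check on-device; the result is never reported off the device.
    func isSensitive(_ image: CGImage) async -> Bool
}

/// Example of how a chat app might use such an analyzer.
struct IncomingAttachmentHandler {
    let analyzer: CommunicationSafetyAnalyzing

    func present(_ image: CGImage) async {
        if await analyzer.isSensitive(image) {
            // Blur the attachment and surface the system warning and support resources.
        } else {
            // Render the attachment normally.
        }
    }
}
```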

“Potential child exploitation can be stopped before it happens by providing parents with opt-in tools to help protect their children from unsafe communications,” the company said in its statement. “Apple is committed to developing innovative privacy protection solutions to combat child sexual abuse material and protect children while meeting the unique privacy needs of personal communications and data storage.”

Like other companies that have publicly grappled with how to handle CSAM, including Meta, Apple told WIRED that it plans to keep working with child safety experts to make it as easy as possible for its users to report exploitative content and situations to advocacy organizations and law enforcement.

Countering CSAM is a complicated and nuanced endeavor with extremely high stakes for children around the world, and it’s still unknown how much traction Apple’s gamble on proactive intervention will gain. But tech giants are walking a fine line as they work to balance CSAM detection and user privacy.