Last year, Apple announced plans to help combat child sexual abuse in several areas. As part of iOS 15.2, Apple implemented a new parental control feature for Messages to help prevent children from seeing or sending sexual imagery, and boosted Siri and Search to provide resources targeted at children who may be victims. However, its most controversial feature, which would scan your photos as they were uploaded to iCloud to check whether they matched known child sexual abuse material (CSAM), was delayed.
Though Apple took significant steps to ensure that users’ privacy would be protected and that outside actors (such as government agencies) could not use the technology to make Apple scan for images of things like dissidents, the feature raised a lot of red flags in the privacy community. From the moment it was announced, people were concerned about the prospect of “mass surveillance” and about allowing Apple to run a system that could be repurposed to scan for other kinds of content.
On Wednesday, in an interview with The Wall Street Journal, Apple’s senior VP of software engineering, Craig Federighi, confirmed that the CSAM scanning feature has been scrapped. The other features have been in place in iOS since last year. Federighi said, “Child sexual abuse can be headed off before it occurs. That’s where we’re putting our energy going forward.”
In a statement to Wired, Apple elaborated: “We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.”
It’s possible that the system also ran into technical hurdles. Apple announced Wednesday that it will offer an option for end-to-end encryption on most iCloud data, including Photos, which could have made CSAM scanning too difficult to implement properly.
Source: Macworld