
Apple has detailed why it abandoned its proposed iCloud photo-scanning tool, which was designed to detect child sexual abuse material (CSAM). According to the company, such a project would have given bad actors new flaws to exploit, creating bigger privacy and security problems for users. The iPhone maker also stressed its strong opposition to surveillance, which runs counter to its privacy stance.

The statement came after the child safety group Heat Initiative revealed plans for a campaign to push Apple to revive the project and implement it. Sarah Gardner, who leads the Heat Initiative, said the group found Apple’s decision to abandon the project “disappointing.”

Erik Neuenschwander, Apple’s director of user privacy and child safety, addressed this and shared the company’s reasons with the American magazine WIRED. According to Neuenschwander, the project could not be implemented without “ultimately imperiling” the security and privacy of Apple customers, and he stressed the problems that would inevitably arise. He added that the decision followed collaboration with privacy and security researchers, digital rights groups, and child safety advocates.

“Scanning every user’s privately stored iCloud data would create new threat vectors for data thieves to find and exploit,” Neuenschwander wrote to Heat Initiative. “It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.”

The statement also reflects the Cupertino giant’s view of the surveillance powers the United Kingdom wants to impose on all tech companies operating in its market. Legislators are seeking amendments to the UK’s Investigatory Powers Act (IPA) 2016, including the power to require companies to withhold security updates so the government can continue its surveillance of users’ devices. Apple believes the proposals “constitute a serious and direct threat to data security and information privacy.”

Apple again stood its ground on the issue raised by the Heat Initiative, saying that scanning for CSAM would open the door to demands to apply the same process to its messaging systems. Instead, Neuenschwander stressed the company’s continued promotion of Communication Safety, an opt-in feature in the Messages app on iPhone, iPad, Mac, and Apple Watch. In a nutshell, it automatically blurs nude images and warns users about the material. Under the comforting message “You’re not alone,” a child user is given three options: Message Someone, Other Ways to Get Help, and Block Contact. According to Apple, the blurring, analysis, and identification of nude images and attachments all rely on on-device machine learning, which means “Apple doesn’t get access to the photos.” The feature is off by default but can be activated on child accounts signed in with their Apple ID.

Apple hopes to focus on this on-device feature rather than the earlier proposed iCloud-wide tool, which would have put every Apple customer at risk if a flaw ever emerged. In the WIRED report, the company also noted that third-party developers, including Discord, are adopting the Communication Safety API in their apps.
