Heat Initiative anti-CSAM campaign


Alongside the official launch of a campaign pressing Apple to resume work on an iCloud scanning tool to detect child sexual abuse material (CSAM), a new report indicates that Apple investors will add to the pressure by submitting a shareholder resolution.

This follows a report announcing child safety group Heat Initiative’s plan to campaign to push Apple into completing the project. According to Sarah Gardner, who leads the Heat Initiative, the group found Apple’s decision to abandon the idea “disappointing.” The campaign has now launched a dedicated website to gather support from other individuals and groups. Research materials are posted on the site alongside a prominent statement: “Child sexual abuse is stored on iCloud,” the website banner reads. “Apple allows it.”

Aside from this, a report from The New York Times revealed that investors are also pushing Apple to act on the matter. The report says a group of investors with nearly $1 trillion under management wants Apple “to publicly report the number of abusive images that it catches across its devices and services.”

Moreover, the NYT named two investors — Belgian asset manager Degroof Petercam and Catholic investment firm Christian Brothers Investment Services — expected to file a shareholder proposal this month. According to a Degroof Petercam representative, the proposal would “wake up” Apple to take the matter seriously. The report notes, however, that Apple has already engaged with the issue, with its privacy executives meeting the group of investors in early August. After that meeting, the company responded to Heat Initiative’s letter, explaining why it abandoned the project.

As explained by Erik Neuenschwander, Apple’s director of user privacy and child safety, the project could not have been implemented without “ultimately imperiling” the security and privacy of Apple customers, and it would have opened the door to further problems down the line. He added that the decision resulted from collaboration with privacy and security researchers, digital rights groups, and child safety advocates.

In the end, Neuenschwander underscored Apple’s decision to rely instead on its Communication Safety feature, an opt-in feature in the Messages app on iPhone, iPad, Mac, and Apple Watch that automatically blurs nude images and warns users about the material. Apple intends to focus on this on-device feature rather than the previously proposed tool covering the entire iCloud platform, which would have affected every Apple customer if a flaw arose.
