At the end of 2022, Apple cancelled its initiative to detect child sexual abuse photos stored in iCloud. Since its announcement, the plan had generated considerable controversy because, despite its good intentions, it could lead to serious privacy problems. And while the people of Cupertino stayed true to their original strategy despite criticism from the public and dozens of civil society organizations, they ended up throwing everything overboard with no further explanation. Until now.
For the first time, Apple has explained why it shelved the initiative to scan photos stored in iCloud for child abuse material. Eric Neuenschwander, vice president of user privacy and child safety at the California-based firm, laid out the reasons in a letter sent to Sarah Gardner, CEO of the Heat Initiative, a child safety group (via WIRED).
Gardner had initially sent an email to Tim Cook and other Apple executives expressing her dissatisfaction with the initial delay and subsequent cancellation of the plan to detect child abuse photos on the company's devices and cloud services. She also told them that her group would launch a public pressure campaign to push Apple to implement a system to detect, report, and remove child pornography stored in iCloud, and to create a mechanism that lets users of its products report this type of material.
Neuenschwander's response is very interesting. The executive openly admitted that Apple could not work out the technical details of a child abuse photo detection system that would be effective without compromising user privacy.
Apple consulted technology, security, and human rights experts and examined scanning technology “from virtually every possible angle.” The conclusion was clear: what it wanted to achieve was not possible without concessions on privacy.
Apple and the cancellation of its initiative to detect child abuse photos in iCloud
In the email sent to the Heat Initiative's leader, Neuenschwander notes that child pornography is abhorrent and lists the many steps Apple has taken to protect minors who use its devices. He then goes into detail about why the initiative to detect this type of content in iCloud was cancelled.
“Companies regularly scan personal data in the cloud to monetize their users' information. While some companies have justified this practice, we have chosen a completely different path: one that prioritizes the security and privacy of our users. We determined that scanning every user's privately stored iCloud content would have significant unintended consequences for them. […] Scanning information held privately in iCloud would create new threat vectors for data thieves to find and exploit.
It would also create the potential for unintended consequences. For example, scanning for one type of content opens the door to bulk surveillance and could create a desire to search other encrypted messaging systems for other categories and types of content (such as images, video, text, or audio). How can users be sure that a tool for one type of surveillance has not been reconfigured to monitor other content, such as political activity or religious persecution? Tools of mass surveillance have widespread negative consequences for freedom of expression and, by extension, for democracy as a whole. In addition, building this technology for one government could require it to be applied in other countries and against new types of data.
Scanning systems are not infallible either, and there is documented evidence from other platforms of innocent people being swept into dystopian dragnets after doing nothing more than sharing perfectly normal and appropriate pictures of their children.”
An excerpt from an email sent by Eric Neuenschwander to Sarah Gardner.
Apple does not often speak openly about decisions regarding its product or service strategy, but its initiative to detect child abuse photos in iCloud clearly warranted an exception. It is also likely that Neuenschwander's message was never intended for public release. Even so, it at least gives a clearer picture of what the Cupertino company weighed up in this story.
One of Apple’s most controversial plans

Let's remember that Apple initially stood by its decision to scan devices like the iPhone for images of child abuse. In fact, Craig Federighi, the company's senior vice president of software engineering, admitted that the announcement had caused confusion, and he even tried to explain the company's approach to preserving user privacy.
The executive explained that the system would be based on CSAM hashes. What does that mean? Apple would check images against a database of hashes of known child sexual abuse photographs, compiled by child protection organizations and independently verified. The technology would raise an alarm only once a certain number of matches had been found.
“If and only if you reach the threshold of around 30 matches with known child pornography photos, only then does Apple learn anything about your account and about those images. And at that point, it knows only about those images, and not about any of your other photos. This is not an analysis of whether you had a photo of your child in the bath or, for that matter, whether you had a pornographic image of any other kind. It literally matches only the exact fingerprints of specific known child pornography images,” Federighi explained to The Wall Street Journal in August 2021.
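To make the idea concrete, here is a minimal sketch of that kind of threshold-based matching in Swift. It is only an illustration of the general mechanism described above, not Apple's actual NeuralHash or safety-voucher implementation; the variable names, the sample fingerprints, and the way matches are counted are all hypothetical stand-ins.

```swift
import Foundation

// Purely illustrative threshold check: an account would be flagged only
// after the number of photos whose fingerprints match a verified database
// of known-CSAM hashes crosses a fixed threshold.

let matchThreshold = 30  // the "around 30" figure Federighi mentioned

// Stand-in fingerprints; Apple's real system used perceptual hashes and
// cryptographic safety vouchers, not plain strings like these.
let knownHashes: Set<String> = ["a1b2c3", "d4e5f6", "0789ab"]
let userPhotoHashes = ["a1b2c3", "ffff00", "d4e5f6"]

// Count how many of the user's photos match the verified database.
let matchCount = userPhotoHashes.filter { knownHashes.contains($0) }.count

// Only once the threshold is crossed would anything be surfaced for
// human review; below it, no action is taken.
if matchCount >= matchThreshold {
    print("Account flagged for review: \(matchCount) matches")
} else {
    print("Below threshold (\(matchCount) matches): no action taken")
}
```

The point the quote emphasizes is that the decision hinges on the count of exact matches against verified fingerprints, not on any analysis of what an individual photo actually depicts.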
Despite this ironclad initial stance, Apple ultimately concluded that it was not possible to guarantee privacy while searching for images of child abuse, even if the information was processed on the iPhone itself rather than on iCloud servers.
Thanks to Neuenschwander's explanation, we now at least have a more accurate account of this story. However, let's not forget that Apple continues to work on improving protections for minors who use its devices. One of the most prominent examples is the nudity warnings for images sent via iMessage.
Source: Hiper Textual
