About a month ago, Apple announced a system to combat child abuse by scanning iCloud Photos for known abuse imagery. The announcement was met with heavy criticism, and Apple now says it is delaying the system's activation in order to improve it.
Apple released a statement to the website 9to5Mac: "Last month we announced plans for features intended to help protect children from predators who use communication tools to harm them, and to limit the spread of child sexual abuse material (CSAM)," the company said. "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to gather input and make improvements before releasing these important features."
The new child protection feature was to ship as part of iOS 15, iPadOS 15, and macOS Monterey. No new release date has been announced. Apple's statement also gave no further detail on what changes might be made to the feature.
According to Apple's earlier announcement, the CSAM detection system is designed with user privacy in mind. Rather than scanning images on Apple's servers, the system matches files on the user's own device against a database of hashes of known CSAM images.
This matching happens before images are stored in iCloud Photos and relies on a cryptographic technique called private set intersection. As part of the process, the iPhone creates a cryptographic safety voucher that encodes the match result along with additional data about the image. This voucher is uploaded to iCloud Photos together with the image.
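To make the idea concrete, the on-device matching step can be sketched in a few lines of Python. This is an illustration only: Apple's actual system uses a perceptual hash (NeuralHash) and private set intersection, so the server never learns non-matching hashes; the names `KNOWN_HASHES` and `make_voucher` below are hypothetical, and plain SHA-256 stands in for the real hashing scheme.

```python
import hashlib

# Hypothetical stand-in for the database of hashes of known CSAM
# images; in Apple's system these hashes are supplied by child-safety
# organizations and a perceptual hash is used, not SHA-256.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def make_voucher(image_bytes: bytes) -> dict:
    """Hash the image locally and bundle the result with it, loosely
    mirroring the 'safety voucher' concept (illustrative only; the
    real voucher is encrypted, so the device itself cannot read the
    match result)."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return {
        "hash": digest,
        "matches_known_db": digest in KNOWN_HASHES,
    }

# The voucher, not a readable verdict, is what accompanies the upload.
voucher = make_voucher(b"known-image-bytes")
print(voucher["matches_known_db"])
```

The key design point this sketch gestures at is that the comparison runs on the device before upload; in the real protocol, private set intersection additionally hides the result from both parties until a threshold of matches is reached.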
The system's introduction drew strong criticism from users and privacy advocates. Apple has repeatedly defended the feature, arguing that this design protects privacy better than the server-side scanning used by companies such as Google and Facebook.