Apple announced it is pushing back the release of its Child Sexual Abuse Material (CSAM) detection tools in order to make improvements.
The decision follows feedback from critics, researchers and advocacy groups.
The feature analyzes photos uploaded to iCloud Photos for known CSAM, which raised privacy concerns.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” Apple told 9to5Mac as quoted by Engadget.
The company added: “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple announced earlier that it would implement a system that checks photos on iPhones in the United States before they are uploaded to its iCloud storage service to ensure the upload does not match known images of child sexual abuse.
Once detections of child abuse image uploads pass a threshold sufficient to guard against false positives, a human review is triggered and the user is reported to law enforcement, Apple said. The company said the system is designed to reduce false positives to one in one trillion.
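To illustrate the general idea of threshold-based matching against a database of known image hashes, here is a minimal Swift sketch. It is not Apple's actual system (which reportedly uses an on-device perceptual hash and cryptographic techniques such as private set intersection); the names perceptualHash, knownHashes and reviewThreshold, and the threshold value, are hypothetical placeholders.

```swift
import Foundation

// Placeholder for a perceptual hash of a photo. Apple's real hashing scheme
// is not public; here we simply hash the raw bytes for illustration.
func perceptualHash(of photoData: Data) -> Int {
    return photoData.hashValue
}

// Hypothetical database of hashes of known CSAM images, supplied out of band
// by child-safety organizations.
let knownHashes: Set<Int> = []

// Illustrative threshold: a single match does not trigger anything; only an
// account crossing the threshold is escalated to human review, which is how
// a design like this aims to keep false positives extremely rare.
let reviewThreshold = 30

// Returns true only if the number of matched photos reaches the threshold.
func shouldEscalateForReview(photos: [Data]) -> Bool {
    let matchCount = photos.filter { knownHashes.contains(perceptualHash(of: $0)) }.count
    return matchCount >= reviewThreshold
}
```

The key design point the sketch highlights is that review is gated on an aggregate count of matches rather than on any single photo, which is what supports the very low false-positive rate Apple cites.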
Apple’s new system seeks to address requests from law enforcement to help stem child sexual abuse while also respecting privacy and security practices that are a core tenet of the company’s brand.
But some privacy advocates said the system could open the door to monitoring of political speech or other content on iPhones.