Apple Removes References to Controversial CSAM Scanning From Website
Posted December 15, 2021 at 3:17pm by iClarified
Apple has quietly removed all references to a previously announced CSAM scanning feature from its Child Safety webpage.
On August 5, Apple announced it would start scanning users' photos and messages for child sexual abuse material. The move was viewed as a shocking about-face by users who had relied on the company's leadership in privacy and security. Outrage was immediate from users, researchers, and organizations like the Electronic Frontier Foundation, which warned:
"We've said it before, and we'll say it again now: it's impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger's encryption itself and open the door to broader abuses. All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children's, but anyone's accounts. That's not a slippery slope; that's a fully built system just waiting for external pressure to make the slightest change."
The backlash led Apple to pause its plans on September 3, with the company stating:
"Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
References to CSAM have now been scrubbed from the Child Safety webpage at Apple.com, suggesting the company may have abandoned its plan. We'll let you know if an official confirmation is provided.
Note that Apple released iOS 15.2 this week, which includes a Communication Safety setting that lets parents enable warnings for children when they receive or send photos containing nudity.
Please download the iClarified app or follow iClarified on Twitter, Facebook, YouTube, and RSS for more updates.