The Electronic Frontier Foundation says Apple's announcement that it will start scanning users' photos and messages is a shocking about-face for users who have relied on the company's leadership in privacy and security.
Earlier today, Apple disclosed that it will soon begin scanning user photos for known child sexual abuse material (CSAM). It will also scan messages sent to and from minors for sexually explicit images. To accomplish this, the company has essentially built a backdoor into photos and messages, making privacy on the iPhone a thing of the past.
We’ve said it before, and we’ll say it again now: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses. All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.
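To make the EFF's concern concrete, consider a minimal sketch (the type, field names, and policy values below are invented for illustration and are not Apple's actual code) of how a narrow scanning policy and a broad one can differ by nothing more than configuration:

```swift
// Hypothetical illustration only; these names and values are invented,
// not Apple's actual implementation.
struct ScanPolicy {
    var targetCategories: [String]    // what the on-device classifier flags
    var scanChildAccountsOnly: Bool   // which accounts are scanned
}

// The narrow policy Apple announced:
let announced = ScanPolicy(
    targetCategories: ["sexually-explicit-imagery"],
    scanChildAccountsOnly: true
)

// EFF's point: widening the system is a parameter change, not a redesign.
let widened = ScanPolicy(
    targetCategories: ["sexually-explicit-imagery", "disfavored-political-imagery"],
    scanChildAccountsOnly: false
)
```

Nothing in the architecture distinguishes the first policy from the second; the difference is a deployment decision.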
Until now, Apple has described iMessage as a secure messaging system in which no one but the sender and the recipient can read a message. End-to-end encryption was supposed to prevent Apple's servers from learning the contents of messages. But if the server has any way to reveal information about the contents of a message, it is no longer end-to-end encryption.
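That distinction can be sketched in a few lines (again with invented types, not Apple's real message format): under end-to-end encryption the server handles only ciphertext, but once a plaintext-derived verdict travels off the device alongside it, the content is no longer opaque to everyone but the endpoints.

```swift
import Foundation

// Invented types for illustration; not Apple's real message format.
struct E2EEMessage {
    let ciphertext: Data           // all the server ever sees
}

struct ScannedMessage {
    let ciphertext: Data
    let flaggedAsExplicit: Bool    // computed from the plaintext on-device,
                                   // then transmitted off the device
}
```

With `E2EEMessage`, the server learns nothing about the content; with `ScannedMessage`, the flag leaks information about the plaintext, so the "no one but sender and recipient" guarantee no longer holds.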
The EFF notes that "In this case, while Apple will never see the images sent or received by the user, it has still created the classifier that scans the images that would provide the notifications to the parent. Therefore, it would now be possible for Apple to add new training data to the classifier sent to users’ devices or send notifications to a wider audience, easily censoring and chilling speech."
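The update path the EFF describes can be sketched the same way (the names below are hypothetical): the classifier runs on the device, but its model and its notification audience are both supplied from outside, so either can change without any modification to the device itself.

```swift
import Foundation

// Hypothetical sketch of the update path the EFF is worried about.
struct OnDeviceClassifier {
    var modelWeights: Data         // training data pushed by the vendor
    var notifyAudience: [String]   // today: parents of child accounts
}

// Retraining the model or broadening the audience requires no change
// on the device itself, just a different payload from the vendor.
func applyUpdate(to classifier: inout OnDeviceClassifier,
                 newWeights: Data, newAudience: [String]) {
    classifier.modelWeights = newWeights
    classifier.notifyAudience = newAudience
}
```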
More details in the full report linked below. Apple says it plans to implement these changes in iOS 15. Please download the iClarified app or follow iClarified on Twitter, Facebook, YouTube, and RSS for the latest updates.
Read More