Apple is Already Scanning iCloud Emails for CSAM
Posted August 23, 2021 at 6:50pm by iClarified
Apple has confirmed that it is already scanning iCloud Mail for child pornography, reports 9to5Mac. The company has been doing so since 2019.
An archived version of Apple's child safety page hinted at this, but the company had not previously stated clearly that it was scanning user emails.
Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. We have developed robust protections at all levels of our software platform and throughout our supply chain. As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation. We validate each match with individual review. Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled.
Apple suggested to 9to5Mac that it was doing limited scanning of other user data, though it would not reveal what that 'other data' was.
Scanning of unencrypted user emails is far less controversial than Apple's plan to scan user photos in iOS. Over the past week, numerous privacy groups and security researchers have spoken out against Apple's plan to scan user photos and messages, and the EFF has launched a petition against it.
Please download the iClarified app or follow iClarified on Twitter, Facebook, YouTube, and RSS for updates.