Apple has announced that it will delay the rollout of CSAM detection features following strong opposition from customers, privacy groups, and security researchers, reports TechCrunch. Nearly one hundred policy and rights groups, including the American Civil Liberties Union and Electronic Frontier Foundation, have called on Apple to abandon the technology.
Today, Apple issued the following statement:
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Given Apple's statement, it appears the company still plans to launch these features, which the EFF has described as essentially a backdoor into its photos and messaging systems.
You can download the iClarified app or follow iClarified on Twitter, Facebook, YouTube, and RSS for updates.