
Apple stops developing CSAM detection system for iPhone users

Last year, Apple announced that iCloud Photos would be able to detect Child Sexual Abuse Material (CSAM) in users’ photos by matching them against a database of known CSAM image hashes. Although the matching would run on-device, so Apple itself would never see the photos, the plan drew heavy criticism from privacy and security researchers.
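To make the hash-matching idea concrete, here is a minimal sketch in Swift. It assumes a deliberately simplified model: Apple’s actual design relied on a perceptual “NeuralHash” and private set intersection rather than the plain SHA-256 lookup shown here, and the empty `knownHashes` set and the file path are illustrative placeholders, not anything Apple shipped.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the known-hash database. Apple's actual design
// used perceptual "NeuralHash" values and private set intersection, not a
// plain cryptographic-hash lookup like this one.
let knownHashes: Set<String> = []

/// Returns true if the photo's digest appears in the known-hash set.
/// All work happens on-device; the image bytes never leave the phone.
func matchesKnownHash(_ imageData: Data, against knownHashes: Set<String>) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Usage: check a photo loaded from disk (the path is illustrative).
if let data = FileManager.default.contents(atPath: "/tmp/photo.jpg") {
    print(matchesKnownHash(data, against: knownHashes))
}
```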

Now, after announcing the new Advanced Data Protection for iCloud, Apple executive Craig Federighi has confirmed that the company will not roll out the CSAM detection system for iPhone users: Apple has stopped developing it.

Federighi confirmed the news in an interview with The Wall Street Journal‘s Joanna Stern. When Apple first announced the CSAM detection system, privacy and security researchers warned that governments or hackers could misuse it to gain access to sensitive information on the phone. Federighi explained Apple’s change of plans:

Mr. Federighi said Apple’s focus related to protecting children has been on areas such as communication and giving parents tools to protect children in iMessage. “Child sexual abuse can be headed off before it occurs,” he said. “That’s where we’re putting our energy going forward.”

For example, through its parental-control software, Apple can notify parents who opt in when nude photos are sent or received on a child’s device, but it will no longer develop a system that scans users’ photo libraries for this kind of material.

Image source: Apple Inc.

Apart from that, Apple today announced three important features coming to iPhone users in 2023: Advanced Data Protection for iCloud, which brings the total number of end-to-end encrypted data categories to 23; Security Keys, which let people use third-party hardware security keys as a second factor for two-factor authentication; and iMessage Contact Key Verification, which lets users confirm that the person they are messaging on iMessage is who they claim to be and not an impostor.
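As a rough illustration of the idea behind contact key verification, the sketch below derives a short, human-comparable code from a public key; two contacts who read matching codes to each other over a trusted channel can be confident no third party has silently swapped in a different key. This is a toy under stated assumptions, not Apple’s actual iMessage protocol, and the `verificationCode` helper is hypothetical.

```swift
import Foundation
import CryptoKit

// Derive a short, human-comparable code from a contact's public key.
// Matching codes on both sides suggest no key substitution has occurred.
func verificationCode(for publicKey: Curve25519.Signing.PublicKey) -> String {
    let digest = SHA256.hash(data: publicKey.rawRepresentation)
    // Show the first 8 digest bytes as a short hex code.
    return digest.prefix(8).map { String(format: "%02X", $0) }.joined(separator: " ")
}

// Usage: each side prints its own code and compares it with the other party
// over a separate trusted channel (in person, a phone call, etc.).
let myKey = Curve25519.Signing.PrivateKey()
print("Read this code to your contact:", verificationCode(for: myKey.publicKey))
```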

“As customers have put more and more of their personal information of their lives into their devices, these have become more and more the subject of attacks by advanced actors,” said Craig Federighi, Apple’s senior vice president of software engineering, in an interview. Some of these actors are going to great lengths to get their hands on the private information of people they have targeted, he said.




