Apple’s new automatic ‘CSAM detection’ system will be disabled when iCloud Photos is turned off

This fall, Apple will introduce a new ‘CSAM detection’ hashing system to contain the spread of Child Sexual Abuse Material (CSAM) online. Implemented across its devices, the new on-device matching system will scan iCloud Photos for known CSAM images and will enable “Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC). NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States.”

The new CSAM detection system will be automatic, not opt-in like Communication Safety in Messages. However, the company confirms that it cannot run CSAM detection if iCloud Photos is turned off.


For Apple’s CSAM detection system to scan photos, iCloud Photos must be enabled

As detailed in the announcement, the company’s CSAM detection system will be a four-step process built on new technologies: performing on-device matching against known CSAM hashes, creating safety vouchers, uploading those vouchers to iCloud Photos, and using threshold secret sharing to ensure that “the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content.” Because the hashing system relies heavily on iCloud Photos, the company confirms it will not work if the feature is turned off.
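To make that four-step flow a little more concrete, here is a minimal Python sketch of the idea. Everything in it is a placeholder: the hash function stands in for NeuralHash, the database and voucher fields are invented names, and the real system hides the match result from the device using private set intersection, none of which is reproduced here.

```python
# Conceptual sketch only -- not Apple's actual protocol or data formats.
import hashlib
import json

KNOWN_CSAM_HASHES = {"<blinded hash values supplied by NCMEC>"}  # placeholder set

def perceptual_hash(image_bytes: bytes) -> str:
    # Stand-in for NeuralHash; a real perceptual hash is robust to resizing,
    # cropping and re-encoding, unlike this cryptographic hash.
    return hashlib.sha256(image_bytes).hexdigest()

def make_safety_voucher(image_bytes: bytes) -> dict:
    h = perceptual_hash(image_bytes)
    # In the real design the device cannot tell whether the hash matched;
    # the match result is sealed inside the encrypted voucher.
    return {
        "encrypted_match_payload": f"<ciphertext derived from {h}>",
        "secret_share": "<one share of the decryption key (threshold secret sharing)>",
    }

def upload_to_icloud_photos(image_bytes: bytes, icloud_photos_enabled: bool):
    if not icloud_photos_enabled:
        return None  # iCloud Photos off: no upload, no voucher, no scanning
    voucher = make_safety_voucher(image_bytes)
    # The photo and its voucher travel together; Apple's servers can only
    # open vouchers once the account crosses the match threshold.
    return {"photo": image_bytes, "voucher": json.dumps(voucher)}
```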

Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged, they can file an appeal to have their account reinstated.
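A rough sketch of that server-side threshold step, again purely illustrative: the THRESHOLD value, function names, and voucher fields below are assumptions, and the real system enforces the threshold cryptographically via secret sharing rather than a simple counter.

```python
# Conceptual sketch of the server-side threshold logic -- not Apple's code.
THRESHOLD = 30  # illustrative number only; Apple did not publish it at announcement

def human_reviewer_confirms(items) -> bool:
    # Placeholder for Apple's manual review step.
    return bool(items)

def review_account(vouchers: list) -> str:
    matching = [v for v in vouchers if v.get("is_match_share")]
    if len(matching) <= THRESHOLD:
        # Below the threshold the secret shares are insufficient to
        # reconstruct the decryption key, so the vouchers stay opaque.
        return "no action: vouchers cannot be decrypted"
    # Above the threshold: reconstruct the key from the shares, decrypt the
    # vouchers, and hand the contents to a human reviewer.
    decrypted = [v["payload"] for v in matching]  # stand-in for real decryption
    if human_reviewer_confirms(decrypted):
        return "account disabled; report sent to NCMEC (user may appeal)"
    return "false positive: no report filed"
```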

In a FAQ document, the company provides more clarification on the functionality of the new scanning system.

CSAM detection in iCloud Photos is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images. CSAM images are illegal to possess in most countries, including the United States. This feature only impacts users who have chosen to use iCloud Photos to store their photos. It does not impact users who have not chosen to use iCloud Photos. There is no impact to any other on-device data. This feature does not apply to Messages.

Cryptography expert Matthew Green accurately broke the news of Apple’s new client-side photo hashing system before the official announcement and warned about its negative implications for users’ privacy and security. He notes that pictures in iCloud Photos are not end-to-end encrypted; they are stored in encrypted form on the company’s server farms, which means a law enforcement agency can subpoena Apple for a specific user’s photo library. He highlighted four issues with the new CSAM fingerprinting: false positives, collision attacks, misuse by authoritarian governments, and breach of privacy.
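To see why false positives and collision attacks worry researchers, consider this toy example. It uses a trivial “average hash,” not NeuralHash, and two hand-crafted 4x4 grayscale grids; the point is only that visibly different images can produce identical perceptual-hash bits.

```python
# Toy illustration of a perceptual-hash collision (NOT NeuralHash).

def average_hash(pixels):
    # Bit = 1 where the pixel is brighter than the image's mean value.
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

image_a = [200, 10, 220, 15,
           30, 240, 20, 210,
           250, 5, 230, 25,
           10, 235, 15, 245]

# A visibly different image (different brightness everywhere) that happens
# to share the same brighter-than-mean pattern, so it hashes identically.
image_b = [90, 40, 95, 42,
           45, 99, 41, 92,
           98, 39, 96, 44,
           38, 97, 43, 91]

assert image_a != image_b
assert average_hash(image_a) == average_hash(image_b)  # a collision
print("Two different images, same perceptual hash:", average_hash(image_a))
```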

Read the complete documentation on the new child safety tools here, and let us know your opinion on the new child safety measures in the comments.

About the Author

Addicted to social media and in love with the iPhone, I started blogging as a hobby, and now it's my passion: every day is a new learning experience. Hopefully, manufacturers will continue to deliver innovative solutions, and we will keep letting you know about them.
