Apple might crack down on child abuse images via a new client-side photo hashing system

Cryptography expert Matthew Green claims that Apple will soon release a new client-side photo hashing system to detect child abuse images in users’ photo libraries. The system is reportedly the company’s effort to flag child abusers and pedophiles.

Hashing is a process that maps a piece of data to a fixed-size value known as a hash or hash code. The function that performs this mapping is called a hash function, and hash functions come in two broad families: cryptographic and non-cryptographic.
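To make this concrete, here is a minimal sketch using Python's standard-library `hashlib` with a cryptographic hash function (the photo bytes below are invented for illustration):

```python
import hashlib

# Map arbitrary data to a fixed-size value with a cryptographic hash function.
photo_bytes = b"example image data"
digest = hashlib.sha256(photo_bytes).hexdigest()
print(len(digest))  # 64 hex characters, regardless of input size

# Even a one-byte change produces a completely different hash, so identical
# hashes imply (with overwhelming probability) identical data.
tampered = hashlib.sha256(b"example image dataX").hexdigest()
print(digest == tampered)  # False
```

A cryptographic hash like SHA-256 only matches exact copies of a file; as discussed below, systems that need to recognize resized or recompressed images use perceptual hashes instead.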



Apple’s alleged client-side hashing system to report child abuse content might not be a good idea

As reported, Apple will ship a set of fingerprints (hash codes) representing known illegal content to the iPhone. The device will then hash each photo in the user’s camera roll and compare the result against that list to identify illicit content such as child pornography and other abusive material. To preserve users’ privacy, all matching is done on the device, as with Apple’s on-device machine learning features, and only the IDs of matched photos are reported for human review.
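The reported on-device flow can be sketched roughly as follows. This is a simplified illustration, not Apple's actual implementation: the fingerprint values and photo IDs are invented, and a plain SHA-256 hash stands in for whatever matching scheme Apple uses.

```python
import hashlib

# Hypothetical fingerprint list of known illegal content, shipped to the
# device. The value here is invented purely for illustration.
known_hashes = {hashlib.sha256(b"known-bad-image").hexdigest()}

def scan_library(photos):
    """Hash each photo on-device and return the IDs whose hashes match
    the fingerprint list. Only these IDs would be reported for review."""
    matches = []
    for photo_id, data in photos.items():
        if hashlib.sha256(data).hexdigest() in known_hashes:
            matches.append(photo_id)
    return matches

library = {"IMG_001": b"vacation photo", "IMG_002": b"known-bad-image"}
print(scan_library(library))  # ['IMG_002']
```

The key privacy claim is that the raw photos never leave the device; only the matched IDs do.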

However, Green argues that the new system could be easily exploited and could flag false positives. In a detailed Twitter thread, he warns that Apple’s new hashing system would open the floodgates to mass surveillance by miscreants and government agencies alike, in the name of looking for “harmful” content.


Users’ messages and most of their data are end-to-end encrypted, which provides a necessary layer of protection against unwarranted spying. Green notes that Apple’s hashing system would give whoever controls the hash list a window into that encrypted data by “surveilling every image anyone sends.” In the wrong hands, it could supply material for blackmail and defamation.

Green continues in his thread:

“But there are worse things than worrying about Apple being malicious. I mentioned that these perceptual hash functions were ‘imprecise.’ This is on purpose. They’re designed to find images that look like the bad images, even if they’ve been resized, compressed, etc.”

“This means that, depending on how they work, it might be possible for someone to make problematic images that ‘match’ entirely harmless images. Like political images shared by persecuted groups. These harmless images would be reported to the provider.”
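Green's point about imprecision can be illustrated with a toy "average hash," a simple kind of perceptual hash. This is not Apple's actual algorithm, and real perceptual hashes first downscale the image; the 2x2 "images" below are invented to keep the sketch self-contained.

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is brighter
    than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

# A tiny grayscale image and a brightness-shifted copy (as after recompression).
original = [[200, 50], [60, 210]]
brighter = [[220, 70], [80, 230]]  # every pixel +20

print(average_hash(original) == average_hash(brighter))  # True: still "matches"

# But an attacker can craft a different image with the same hash,
# which is exactly the false-positive risk Green describes.
crafted = [[255, 0], [0, 255]]
print(average_hash(crafted) == average_hash(original))  # True
```

Because only the bright/dark pattern matters, the hash survives brightness shifts and compression, but by the same token it is easy to construct collisions deliberately.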

In conclusion, Green believes the system is a mistake whose consequences will only become clear when it is too late for any remedy. What is your opinion on the new hashing system? Let us know in the comments.

