Apple is rumored to be preparing new technology that scans the photos in your iPhone gallery to determine whether you have saved child sexual abuse material (CSAM).
Apple would reportedly use a hashing algorithm to check photos stored on users' iPhones, with photo-identification software on the backend to recognize whether an image appears to depict child pornography or other abuse. Apple is reportedly launching a “client-side tool for CSAM scanning.”
This means your iPhone would automatically download this hashing algorithm, which would check each photo saved on the device to identify whether it contains illegal content. If the algorithm flags content, and too many photos are flagged, the iPhone would automatically report it to Apple's servers.
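As a rough illustration of that flow, here is a minimal Python sketch of on-device matching against a hash blocklist with a reporting threshold. The blocklist entry, function names, and threshold value are all hypothetical, and the rumored system is said to match perceptual hashes rather than the cryptographic hash used here for simplicity.

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of known-bad image hashes pushed to the device.
# (Illustrative only: the rumored system matches perceptual hashes, not SHA-256.)
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

REPORT_THRESHOLD = 3  # hypothetical: report only after several matches


def hash_photo(path: Path) -> str:
    """Hash a photo's raw bytes (a stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def should_report(photo_paths: list[Path]) -> bool:
    """True if enough photos match the blocklist to trigger a report."""
    matches = sum(1 for p in photo_paths if hash_photo(p) in KNOWN_BAD_HASHES)
    return matches >= REPORT_THRESHOLD
```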
“I already have independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning … https://t.co/LDZ40X4Cow” - Matthew Green (@matthew_d_green), August 5, 2021
The problem is that hashing algorithms are not always accurate and can produce false-positive reports.
Also, while Apple claims the system only detects illegal child abuse content, it is possible that Apple could flag other types of content as well.
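To see why false positives are plausible, consider a toy average-style perceptual hash: it keeps one bit per pixel depending on whether that pixel is brighter than the image's mean. This is a deliberately simplified stand-in, not Apple's algorithm, but it shows how two clearly different images can produce the same hash.

```python
def average_hash(pixels: list[int]) -> int:
    """Toy perceptual hash: one bit per pixel, set if brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Two very different 2x2 "images" (flattened pixel grids) that collide:
img_a = [10, 10, 200, 200]   # high-contrast: dark top, bright bottom
img_b = [90, 90, 110, 110]   # nearly uniform grey

assert average_hash(img_a) == average_hash(img_b)  # both hash to 0b0011
```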
“If Apple allows governments to control the fingerprint content database, they could perhaps use the system to suppress political activism,” according to a report by 9to5Mac.
Even though Apple stores iCloud content in encrypted form, the problem is that Apple also holds a key that can decrypt it. This means that if Apple is compelled by a law enforcement agency, it could give the government access to any specific user's photos.
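The distinction is easy to see in code. The sketch below uses Python's third-party cryptography package to mimic a server-held-key design: the data is encrypted at rest, yet whoever holds the key can still read it. This is an illustrative assumption, not Apple's actual iCloud implementation; under true end-to-end encryption the key would exist only on the user's devices.

```python
from cryptography.fernet import Fernet

# In a server-held-key design, the provider generates and keeps the key.
server_key = Fernet.generate_key()
vault = Fernet(server_key)

# The user's photo is encrypted "at rest" on the server...
ciphertext = vault.encrypt(b"user photo bytes")

# ...but because the provider holds the key, it (or anyone who can
# legally compel it) can decrypt the data at any time.
assert vault.decrypt(ciphertext) == b"user photo bytes"
```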
“The way Apple is launching this, they will start with non-E2E photos that people have already shared with the cloud. So it doesn't ‘hurt' anyone's privacy. But you have to ask why anyone would develop a system like this if scanning E2E photos wasn't the goal,” said cryptography and security expert Matthew Green.
Governments around the world have demanded technology that breaks E2E communication, since law enforcement agencies have difficulty decrypting it.
The only hope here is that Apple will not allow the system to be misused.
“But even if you believe Apple won't allow these tools to be misused, there's still a lot to be concerned about. These systems rely on a database of ‘problematic media hashes' that you, as a consumer, cannot review,” Green added.
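Green's point about reviewability follows from how hashes work: a digest is one-way, so a published list of hashes reveals nothing about the content it flags. A minimal sketch, with a hypothetical blocklist entry:

```python
import hashlib

# A consumer sees only opaque digests like this hypothetical placeholder.
blocklist = {"<opaque digest supplied by the provider>"}

def is_flagged(image_bytes: bytes) -> bool:
    """You can test an image you already possess against the list..."""
    return hashlib.sha256(image_bytes).hexdigest() in blocklist

# ...but not the reverse: nothing in a digest shows whether the listed
# item is actually CSAM or, say, suppressed political imagery.
```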