Apple has announced that it is delaying its plan to launch its child sexual abuse material (CSAM) detection feature for the iPhone in the US after widespread criticism.
“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take extra time over the coming months to gather input and make improvements before releasing this very important child safety feature,” Apple said.
The CSAM scanning feature was announced by Apple last month. It was set to arrive with an iOS 15 update and would identify images of child pornography stored on the iPhone.
Apple will extend the feature to the iPad, Apple Watch, and Mac as well.
Whenever an Apple device detects images related to child pornography or child abuse, it will automatically obscure the content and report it to Apple's servers.
This feature will be available in the US.
For iPhone users in the US, once such content is detected, Apple will automatically alert the National Center for Missing & Exploited Children (NCMEC) and law enforcement agencies, along with the user's Apple ID.
Cybersecurity and privacy advocates are worried about the new feature: if Apple can detect child pornography on users' iPhones with such accuracy, they argue, nothing stops Apple from scanning for other kinds of content or dissenting opinions.
Apple could be forced by governments in the future to spy on political opponents, protesters, and reporters.
Apple will use on-device matching technology against a database of known CSAM image hashes provided by NCMEC and other child safety organizations.
Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.
The matching process uses a cryptographic technology called private set intersection, which determines whether there is a match without revealing the result.
The device then creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image.
This voucher is uploaded to iCloud Photos along with the image.
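To make the sequence of steps easier to follow, here is a minimal, purely illustrative Swift sketch of the flow described above. The names (SafetyVoucher, imageHash, makeVoucher) are hypothetical, and SHA-256 plus a plain set lookup stand in for Apple's NeuralHash perceptual hash and the private set intersection protocol, which work very differently; in the real system the device does not learn whether a match occurred.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of the on-device matching flow described above.
// SHA-256 and a Set lookup are stand-ins for NeuralHash and private set
// intersection, used here only to show the order of the steps.

struct SafetyVoucher {
    let imageIdentifier: String
    let encryptedMatchPayload: Data   // encodes the match result plus extra image data
}

/// Stand-in for the perceptual hash computed on the device.
func imageHash(for imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

/// Matches the image against the known hash database and wraps the
/// (encrypted) result in a voucher to be uploaded alongside the photo.
func makeVoucher(imageData: Data,
                 imageIdentifier: String,
                 knownHashes: Set<String>,
                 encryptionKey: SymmetricKey) throws -> SafetyVoucher {
    let matched = knownHashes.contains(imageHash(for: imageData))

    // In the real protocol the device cannot read the match result;
    // here we simply seal it so only the key holder can decrypt it.
    let payload = try JSONEncoder().encode(["matched": matched])
    let sealed = try ChaChaPoly.seal(payload, using: encryptionKey)

    return SafetyVoucher(imageIdentifier: imageIdentifier,
                         encryptedMatchPayload: sealed.combined)
}
```

The point of the voucher design is that the match decision travels with the photo in encrypted form, so the determination is made against the uploaded data rather than being exposed on the device itself.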