Apple launches a new iPhone feature for child safety: all details

Written by news2in

Apple has announced new child safety features for the iPhone, iPad and Mac.
The new features, developed in collaboration with child safety experts, focus on three areas.
Parents will have more control and will be able to play a more informed role in helping their children navigate online communication.
Apple wants to limit the spread of Child Sexual Abuse Material (CSAM) on its devices.
Apple is also bringing new child safety guidance to Siri and Search.
Here is a look at the new features in detail.

New safety features in iMessage

iMessage, Apple's messaging application, will get new tools that will warn children and their parents when they receive or send sexually explicit photos.
For example, when such content is received, the photo will automatically be blurred and a pop-up will appear.
The child will be presented with “helpful resources” and reassured that “it is okay if they do not want to view this photo,” Apple explained.
At the same time, the child’s parents will get a notification if the child does end up viewing the picture.
Similar protection is available if a child tries to send sexually explicit photos.
Children will be warned before the photo is sent, and parents can receive a message if the child chooses to send it.
Apple says that it uses on-device machine learning to analyze images and determine whether they are sexually explicit.
“The feature is designed so that Apple does not get access to the messages,” the company said.
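To make the described flow concrete, here is a minimal sketch of this kind of on-device check, assuming a hypothetical SensitiveImageClassifier type; this is not Apple's API, and the real feature relies on Apple's own on-device model, whose details are not public.

```swift
import Foundation

// Illustrative sketch only. `SensitiveImageClassifier` is an assumed name,
// not an Apple API; the real feature runs Apple's own on-device ML model.
struct SensitiveImageClassifier {
    /// Returns true if the (hypothetical) model flags the image as sexually explicit.
    func isSensitive(_ imageData: Data) -> Bool {
        return false // a real model would evaluate the image here
    }
}

func handleIncomingPhoto(_ imageData: Data, notifyParents: Bool) {
    let classifier = SensitiveImageClassifier()
    if classifier.isSensitive(imageData) {
        // Blur the photo and show the warning with helpful resources.
        print("Photo blurred; warning and resources shown to the child.")
        if notifyParents {
            // Per the article, parents are notified only if the child chooses to view it.
            print("Parents will be notified if the child chooses to view the photo.")
        }
    } else {
        print("Photo displayed normally.")
    }
}

handleIncomingPhoto(Data(), notifyParents: true)
```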
CSAM detection in iCloud on iPhone, iPad and Mac

CSAM refers to content that depicts sexually explicit activities involving a child.
Apple said that it kept user privacy in mind when designing the feature that will allow it to detect CSAM images stored in iCloud Photos.
Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against a database of known CSAM hashes.
“This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result,” Apple explained.
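As a rough illustration of hash matching, here is a minimal sketch, assuming a simplified KnownHashDatabase type with a plain set lookup; Apple's actual system uses perceptual hashes and the private set intersection protocol mentioned above, so the device never learns whether a match occurred, and neither technique is reproduced here.

```swift
import Foundation

// Greatly simplified sketch: the type name and the plain set lookup are
// assumptions for illustration, not Apple's implementation.
struct KnownHashDatabase {
    private let knownHashes: Set<String>

    init(knownHashes: Set<String>) {
        self.knownHashes = knownHashes
    }

    /// Checks a photo's hash against the known database before upload.
    func matches(photoHash: String) -> Bool {
        knownHashes.contains(photoHash)
    }
}

// Usage: compute a hash for the photo about to be uploaded and compare it.
let database = KnownHashDatabase(knownHashes: ["a1b2c3", "d4e5f6"]) // placeholder hashes
let photoHash = "a1b2c3" // stand-in for the hash of the outgoing photo
if database.matches(photoHash: photoHash) {
    print("Hash matches a known CSAM entry.")
} else {
    print("No match; the photo is stored in iCloud Photos normally.")
}
```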
Child safety features in Siri and Search

Siri and Search on Apple devices are also getting new tools to help children and parents stay safe online and get help in unsafe situations.
For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources on where and how to file a report.
Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM.
“These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue,” Apple added.
These new child safety features will arrive later this year with iOS 15, iPadOS 15 and macOS Monterey.
Apple’s new software updates are expected to land in September this year.
