Apple has long worked to make the devices in its ecosystem safe for all users, including children. Now Cupertino has announced new features designed to protect children from sexual abuse and related harms.
Child protection in iMessage
iMessage now includes functionality that warns a child in two steps about photos that are sexual in nature. First, any sexually explicit photo received is blurred. Second, if the child tries to view it, they are warned that a notification will be sent to a member of their family.
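The two-step flow described above can be sketched roughly as follows. This is a hypothetical illustration only: the class and function names are invented for clarity, and Apple's actual on-device implementation is not public.

```python
# Hypothetical sketch of the two-step warning flow described above.
# All names here are illustrative, not Apple's actual API.

from dataclasses import dataclass, field


@dataclass
class IncomingPhoto:
    flagged_explicit: bool  # result of the on-device image analysis
    blurred: bool = False


@dataclass
class ChildAccount:
    parent_notifications: list = field(default_factory=list)


def receive_photo(photo: IncomingPhoto) -> IncomingPhoto:
    # Step 1: blur any photo the on-device model flags as explicit.
    if photo.flagged_explicit:
        photo.blurred = True
    return photo


def attempt_view(photo: IncomingPhoto, account: ChildAccount) -> str:
    # Step 2: warn the child that viewing sends a family notification.
    if photo.blurred:
        account.parent_notifications.append("explicit photo viewed")
        return "warned: a notification will be sent to your family"
    return "shown"
```

The key point the sketch captures is that the decision happens entirely on the device: nothing leaves the phone except the family notification.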
Apple emphasizes that the photos are analyzed by AI on the device itself, so the company does not have any access to them.
Scanning iCloud photos for child pornography
Before any photo is uploaded to iCloud servers, the AI will also scan it to check for child pornography. If such material is detected, the system can send a notification to the appropriate service.
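A minimal sketch of such a pre-upload check is shown below. It assumes the scan works by matching each photo's fingerprint against a database of fingerprints of known abusive images; the plain SHA-256 hash used here is a stand-in (Apple described a perceptual hash), and every name is hypothetical.

```python
# Illustrative pre-upload check: compare a photo's fingerprint
# against a database of known bad fingerprints before allowing
# the upload. SHA-256 is a simplification; the real system was
# described as using a perceptual hash. All names are invented.

import hashlib

# Placeholder fingerprints supplied by a child-safety organization.
KNOWN_BAD_HASHES: set[str] = set()


def fingerprint(photo_bytes: bytes) -> str:
    return hashlib.sha256(photo_bytes).hexdigest()


def notify_reviewers(matched_hash: str) -> None:
    # Placeholder: a match would trigger review and, if confirmed,
    # a report to the appropriate service.
    print(f"match reported: {matched_hash[:12]}")


def check_before_upload(photo_bytes: bytes) -> bool:
    """Return True if the photo may be uploaded, False on a match."""
    fp = fingerprint(photo_bytes)
    if fp in KNOWN_BAD_HASHES:
        notify_reviewers(fp)
        return False
    return True
```

The design choice worth noting is that only fingerprints, never the photos themselves, are compared, which is how the check can run before upload without the company viewing the images.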
Of course, the company has assured users that the checks will be carried out in accordance with its privacy policy.
Siri against child abuse
Apple has also updated Siri and Search to help users report kidnappings and cases of child abuse.
These safety updates should arrive with the public releases of iOS 15, iPadOS 15, and macOS 12 Monterey in the fall, around the time the new iPhone 13 is expected to be presented.