Apple has begun rolling out globally one of the child-safety tools it announced last year to protect users under the age of 13 on iPhone, iPad, and Mac. The company has activated nudity alerts in Messages, a feature also known as “communication safety in Messages,” in the UK, Canada, Australia, and New Zealand, among other countries. It aims to prevent children from seeing explicit images received through the messaging service.
This tool, which must be enabled manually through Settings, scans photos sent via Messages to detect nudity or other sexually explicit content, all while maintaining end-to-end encryption of media files. If the system determines that an image is explicit because it contains nudity or some other kind of sexual activity, it blurs the image and displays a chat message warning that the picture may be sensitive.
Messages uses on-device machine learning to analyze attached images and determine whether a photo contains nudity. The feature is designed so that Apple never has access to the photos.
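Apple has not published the classifier that Messages uses, but its later public SensitiveContentAnalysis framework (iOS 17+ / macOS 14+) performs comparable on-device nudity detection. The Swift sketch below is purely illustrative and assumes that framework rather than the private Messages implementation; `checkImage(at:)` is a hypothetical helper name.

```swift
import Foundation
import SensitiveContentAnalysis

// Illustrative sketch only: this uses Apple's public SensitiveContentAnalysis
// framework, not the internal classifier that Messages itself relies on.
func checkImage(at url: URL) async {
    let analyzer = SCSensitivityAnalyzer()

    // Analysis is only available when the user (or MDM/Screen Time policy)
    // has enabled sensitive content detection on the device.
    guard analyzer.analysisPolicy != .disabled else {
        print("Sensitive content analysis is not enabled on this device.")
        return
    }

    do {
        // The model runs entirely on device; the image never leaves it.
        let analysis = try await analyzer.analyzeImage(at: url)
        if analysis.isSensitive {
            print("Image flagged as sensitive: blur it and show a warning.")
        } else {
            print("Image appears safe to display.")
        }
    } catch {
        print("Analysis failed: \(error)")
    }
}
```

This mirrors the privacy design described above: detection happens locally, and only the app's own UI, not Apple, sees the result.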
Apple will offer a series of measures to minors who receive explicit images
The app will also let the user choose between messaging an adult contact to alert them that a sensitive image has been received, or blocking the contact. It will also display an “other ways to find help” option; in that case, Apple redirects the user to a website with resources and advice on how to avoid continuing the conversation.
If the user ultimately decides to view the photo, the Messages app will warn once more about the kind of content the image may contain, and will again offer the option to ignore it or contact an adult.
Apple has also started activating an optional child-protection feature that covers searches from Spotlight, Siri, and Safari, according to The Verge. When a user searches any of these three services for topics related to child sexual abuse, the displayed results will surface safety resources and support content instead.
Meanwhile, the CSAM detection feature remains unavailable. This tool would scan iPhone photos for child sexual abuse material before they are uploaded to iCloud. Apple delayed its launch after numerous concerns were raised by users and by security and privacy experts.
Source: Hipertextual