
How Apple will scan for child exploitation images on devices, and why it is raising eyebrows

Apple has begun rolling out the iOS 15.2 beta, which includes one of the Child Safety features the company previewed earlier this year, albeit with a minor change. The update adds a Communication Safety feature to the Messages app that, as the name suggests, is designed to keep children safe online.

The new feature is not turned on by default, so it has to be enabled in the Messages settings. Once enabled, the software can reportedly detect nudity in photos sent or received by children. If a child attempts to send a nude image, it will be blurred immediately and the child will be warned about the content, according to reports.
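Apple has not published the model or API behind this check, and the details above come from reports rather than documentation. As a rough illustration only, the sketch below shows the described flow in Swift: an on-device check, a local blur, and a warning. The SensitivityClassifier protocol, the warning text, and the blur parameters are hypothetical stand-ins, not Apple's implementation.

```swift
import UIKit
import CoreImage
import CoreImage.CIFilterBuiltins

// Hypothetical sketch of the Communication Safety flow described above.
// Apple has not published its classifier; `SensitivityClassifier` is a
// stand-in for whatever on-device model Messages actually uses.
protocol SensitivityClassifier {
    /// Returns true if the on-device model believes the image contains nudity.
    func containsNudity(_ image: UIImage) -> Bool
}

enum AttachmentPresentation {
    case showNormally(UIImage)
    case blurredWithWarning(UIImage, warning: String)
}

struct CommunicationSafety {
    let classifier: SensitivityClassifier
    let isEnabled: Bool          // the feature is off by default

    /// Decide how an incoming or outgoing photo should be presented to a child.
    func evaluate(_ image: UIImage) -> AttachmentPresentation {
        guard isEnabled, classifier.containsNudity(image) else {
            return .showNormally(image)
        }
        // Blur the image locally; nothing is sent to Apple.
        let blurred = blur(image) ?? image
        return .blurredWithWarning(
            blurred,
            warning: "This photo may be sensitive. You can choose not to view it, or message someone you trust."
        )
    }

    private func blur(_ image: UIImage) -> UIImage? {
        guard let input = CIImage(image: image) else { return nil }
        let filter = CIFilter.gaussianBlur()
        filter.inputImage = input
        filter.radius = 40
        guard let output = filter.outputImage else { return nil }
        let context = CIContext()
        guard let cgImage = context.createCGImage(output, from: input.extent) else { return nil }
        return UIImage(cgImage: cgImage)
    }
}
```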

According to reports, the company will also point children to resources and encourage them to reach out to someone they trust for help. If a child receives a nude image, the app will prompt them not to view it. Notably, when Apple first announced Communication Safety, it said that parents of children under 13 would be notified if their child viewed a nude image in Messages.

However, Apple appears to have dropped that notification option, since it could put children at risk if they live with parental violence or abuse. Apple seems to have settled on a better answer to the problem: the feature will instead help the child seek guidance from a trusted adult.

According to the company, the Messages app analyses image attachments on the device to check for nudity, and this has no impact on user privacy because messages remain end-to-end encrypted. Apple still cannot read them.

A few months ago, Apple also announced another safety feature, CSAM detection (detection of child sexual abuse material). It is distinct from Communication Safety and has yet to be rolled out.

With this feature, the Cupertino giant hopes to detect child sexual exploitation and trafficking in iCloud photos. However, Apple postponed its debut to address concerns raised by privacy advocates. The tool compares a user's iCloud Photos against a database of hashes of known CSAM; if it finds enough matches, it notifies Apple's human reviewers, who can disable the account and report the images to law enforcement.
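In outline, this is a threshold-based matching scheme. The Swift sketch below illustrates only that outline and is not Apple's implementation: the real system uses Apple's NeuralHash perceptual hash together with blinded matching and threshold secret sharing, so individual matches are not visible to Apple. None of that cryptography appears here, and the names, the placeholder hash, and the threshold value are assumptions.

```swift
import Foundation

// Greatly simplified sketch of threshold-based hash matching, for illustration only.
// `perceptualHash` and the threshold value are hypothetical placeholders.
struct CSAMMatcher {
    /// Hashes of known CSAM supplied by child-safety organisations (placeholder type).
    let knownHashes: Set<Data>
    /// Number of matches required before anything is surfaced for human review.
    let threshold: Int
    /// A stand-in for a perceptual hash such as NeuralHash.
    let perceptualHash: (Data) -> Data

    /// Returns true only if the library crosses the match threshold and
    /// should be escalated to human reviewers.
    func shouldEscalate(photoLibrary: [Data]) -> Bool {
        let matches = photoLibrary
            .map(perceptualHash)
            .filter(knownHashes.contains)
            .count
        return matches >= threshold
    }
}

// Example wiring with a dummy hash function (illustration only):
let matcher = CSAMMatcher(
    knownHashes: [],
    threshold: 30,            // illustrative; Apple has described a threshold on this order
    perceptualHash: { $0 }    // placeholder; a real perceptual hash is far more involved
)
_ = matcher.shouldEscalate(photoLibrary: [])
```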

News Desk

The Shining Media is an independent news website and channel, covering updates from the world of Politics, Entertainment, Sports, International, National, and a lot more.

