Starting with iOS 17.2, which is currently in beta and is expected to be released in December, Apple's opt-in Sensitive Content Warning feature will work with Contact Posters in both the Contacts and Phone apps, as well as stickers in the Messages app.


When turned on, the Sensitive Content Warning feature uses on-device machine learning to analyze photos and videos, and blurs photos or videos with any detected nudity before users view them. Apple says it does not receive any indication that nudity was detected, and does not gain access to the photos or videos as a result.

Sensitive Content Warnings are also available for messages in the Messages app, AirDrop transfers, and FaceTime video messages. The feature launched earlier this year across the iPhone, iPad, Mac, and Apple Watch. Apple has offered a similar Communication Safety feature for children across the same devices since 2021.
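For developers curious how this works under the hood, Apple exposes the same on-device classifier through the SensitiveContentAnalysis framework introduced alongside iOS 17. A minimal sketch of the check, assuming a local image file URL and an app granted the com.apple.developer.sensitivecontentanalysis.client entitlement (the blurIfSensitive function name is our own illustration, not Apple's):

```swift
import SensitiveContentAnalysis

// Minimal sketch: ask the on-device classifier whether an image
// contains nudity. Nothing is uploaded; the result is one boolean.
func blurIfSensitive(imageAt url: URL) async throws -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // .disabled means the user has not opted in via Settings,
    // so no analysis is available to the app at all.
    guard analyzer.analysisPolicy != .disabled else { return false }

    // Runs the model locally on the device.
    let analysis = try await analyzer.analyzeImage(at: url)
    return analysis.isSensitive // true -> blur before displaying
}
```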

Article Link: iOS 17.2 to Expand Sensitive Content Warnings on iPhone
 
I have a question about this system. Has iOS been using it for the Photos widgets and the Lock Screen featured photos? Because out of my entire gallery, it has barely ever featured NSFW pics.
 
This makes sense, considering that people can send updated pictures and contact covers to other people. It’s kinda like sanitizing user input, I’d say. In fact, it’s a feature Apple might want to add to Safari at some point, for child protection but also, optionally, for profile pictures on social media (I see a fair number of spam bots with adult content in their profile pictures on YouTube, and used to see them years ago when I used Twitter). I reckon they could even add it as a per-website option.
 
CREEPY!! CSAM is what they will implement next?
This is something completely different from the CSAM system: it’s done entirely on-device and is opt-in. It certainly makes sense for parental controls, and I’m sure that women don’t generally appreciate receiving unsolicited male nudity. I’m surprised anyone would consider this controversial.
 
I have an 18-month-old Golden Retriever who likes to sleep on her back, limbs splayed out - like she's the queen of the world and has no cares nor fears.

I am waiting for the time when Apple's sensitive content warnings start flagging the pictures we take of her.

It’s opt-in, so…

CREEPY!! CSAM is what they will implement next?

Give me a break.
 
So it’s scanning and “learning” from my personal photos locally on my device? I’d suspect it’s reporting back to the “mothership” too?

Geee, this could never be used in a very bad and very creepy way, never I tell you 😂

I thought Google was the creepy one.

Also, if I have nudes on my phone, it’s not like I’m going to hand it to ANYONE, sure as heck not “the children” or some frickin’ sensitive type.
 
So it’s scanning and “learning” from my personal photos locally on my device? I’d suspect it’s reporting back to the “mothership” too?
Well, you’d suspect wrong. It’s on device and doesn’t phone home, so your entire post is based on a false assertion.
 
Who truly benefits from this feature? Apple? Or some other entity?
People who don’t want to see nudity in texts or in contact pictures, people who don’t want their kids to receive sexts. All of this image processing is done locally using the Neural Engine on your phone, and it’s opt-in. It makes far more sense as a feature than do the people who claim it doesn’t make sense.
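To back that up with something concrete: the opt-in state is even queryable by apps. A rough sketch, assuming iOS 17 or later, of how the SensitiveContentAnalysis framework reflects the user's Settings toggles (our reading of the policy cases; no image data is involved in this check):

```swift
import SensitiveContentAnalysis

// The analyzer's policy mirrors the user's choice in Settings:
//   .disabled                 -> user never opted in; analysis unavailable
//   .simpleInterventions      -> Sensitive Content Warning enabled
//   .descriptiveInterventions -> Communication Safety (child accounts)
let policy = SCSensitivityAnalyzer().analysisPolicy
print("User opted in:", policy != .disabled)
```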
 
Oh my sweet summer child :)
Effectively, you’re claiming that Apple is lying about how this detection is done. I’m no advocate of the NSA, but you’re engaging in conspiratorial thinking. (Besides, the NSA would probably pull a man-in-the-middle on your network connection or find an exploit and hold onto it instead of however you think it works.)
 
People who don’t want to see nudity in texts or in contact pictures, people who don’t want their kids to receive sexts.
I think there is a slight difference between "nudity" and "unsolicited sexually explicit content".

I hope I won't get banned for life, but I have to admit I was born nude. Unnatural, I know. Freaky.

I really would like to protect my child from all sorts of evil things happening on the big, bad Internet. Violence, verbal bullying, and harassment are commonplace. Nudity (unless it falls into the category of unsolicited sexually explicit content) does not make it onto my list of fears. And I am less worried about them seeing some business-as-usual adult content than videos of people being decapitated or shot.

(Yes, I know. Attitudes towards nudity are a bit different in some European countries vs. the US. But believe it or not, naked body ≠ sex.)
 
I wish Mail had this feature. Recently, some porn spam emails with full nudity have made it past the filter, and because the images are part of the email (not remote), they are shown by default. It makes sorting mail in public spaces a bit awkward.

Couldn't care less about contact posters.
 