Doesn't work for me.
This is excessive nannying in my view. Clearly Apple Intelligence is scanning the photo for nudity and, instead of just letting me edit it, has decided it needs censoring. That's a big overreach. I'm a grown adult and can decide for myself what is and isn't appropriate for me to look at.
I'm also very concerned about how the photos are being scanned - is it on-device or in the cloud? And what determines whether a "safety filter" is applied or not? Most of the photos I've tried it on that got the filter applied were not NSFW, so whatever algorithm it's using isn't very good at its job either.
I'm surprised there isn't a louder reaction to this from users. Is this Apple covertly going back on their previous decision not to scan all your photos in the cloud for CSAM? Not only would that be a massive invasion of privacy, but they're also clearly not very good at determining what is and isn't NSFW content, and are flagging things incorrectly.
The privacy issue is a big concern for me. What are they doing?