
Rogifan

I have some photos from the beach. I’m trying to remove some people far in the background. On some of the photos it works perfectly, but on others it’s all pixelated and I see a message at the bottom that a safety filter has been applied. Maybe it’s an 18.1 bug, but if not, will there be a way to turn this off, or does Apple get to decide what’s removable and what’s not?
 
Good question. Wish I had the answers, because it's stupidly annoying that Apple feels the need to decide what I'm allowed to erase versus what gets pixelated.
 
Super annoying. I'm trying to remove a finger from a picture and it's blurring it out instead of removing it. I assume it thinks the finger is a penis? Really dumb to blur it instead of removing it.
 
I also have issues with beach pictures applying the safety filter… strange. I wonder what is causing it.
 
It seems like anything with skin gets pixelated. I tried removing a tat from my chest and it got pixelated. Same thing with one on my arm. It’s things like this that occasionally make me consider switching OSes. Sammy and Google don’t blur images.

Hopefully this will change as they make improvements.
 
You have to use the Pencil to paint over the object, covering a solid, slightly wider area around it, and then apply Clean Up to the painted area.
 
Doesn't work for me.

This is excessive nannying in my view. Clearly Apple Intelligence is scanning the photo for nudity and, instead of just letting me edit it, has determined that it needs censoring. This is a big overreach of censorship. I'm a grown adult and can decide for myself what is and isn't appropriate for me to look at.

I'm also very concerned about how the photos are being scanned - is it on-device or in the cloud? And what is the determining factor for a "safety filter" being applied or not? Most of the photos I've tried it on where the filter was applied were not NSFW, so whatever algorithm it's using isn't very good at its job either.

I'm surprised there isn't a louder reaction to this from users. Is this Apple covertly going back on their previous decision not to scan all your photos in the cloud for CSAM? Not only would that be a massive invasion of privacy, but they're also clearly not very good at determining what is and isn't NSFW content, and are flagging things incorrectly.

The privacy issue is a big concern for me. What are they doing?
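
For what it's worth, Apple does ship a public, fully on-device nudity detector: the SensitiveContentAnalysis framework (iOS 17 and later). Whether Clean Up's safety filter is actually built on it is pure speculation on my part, but it shows the kind of check that can run without the photo ever leaving the device. A minimal Swift sketch of that API, just as an illustration (the helper function name is mine, not Apple's):

import Foundation
import SensitiveContentAnalysis

// Sketch only: SCSensitivityAnalyzer is Apple's documented on-device nudity
// check. It is an assumption that Clean Up's "safety filter" uses anything
// like it; nothing here reflects how Clean Up actually works internally.
func isPhotoFlaggedAsSensitive(at url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // The analyzer is a no-op unless the user has turned on Sensitive Content
    // Warning (or Communication Safety is enabled for a child account).
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // The model runs locally; the image is never uploaded anywhere.
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        print("Sensitivity analysis failed: \(error)")
        return false
    }
}

Note that this particular framework only does anything when the device-level Sensitive Content Warning setting is on, so even if the safety filter uses similar machinery, the blurring we're seeing in Clean Up is clearly governed by some other, undocumented policy.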
 
I’ve found that if you cover the entire image by drawing over it, or crop the photo down to just the part you want edited with a shape crop, the AI won’t pick it up - but you have to hide as much as you can for it to work. Yeah, super annoying.
 
It's usually things in the photo like dirt on mirrors that I used to be able to blur out in the desktop app with the heal tool. That's the kind of thing I want to edit, but that tool doesn't exist now and you can ONLY use the AI fix tool. I hate it.
 