
dialogos

macrumors 6502
Original poster
Sep 22, 2017
261
294
I have Sensitive Content Warning turned on because that's my preference. However, I keep getting warnings when exchanging AirDrop photos with my wife, which are just pictures of our kids!

This has happened three times this week, and it's incredibly annoying! Turning off the sensitive content warning is not a solution I want. It's outrageous that I keep receiving the same message from my wife—who lives in the same household, is under my family account, and is in my contacts. We exchange photos of our kids multiple times a day. There should be an option to remove sensitive content warnings from approved people.

I also wanted to send a photo to the grandparents, who are not tech-savvy but are in my contact list and under my family account. We FaceTime all the time. Why should they get this warning for photos of their grandchildren?

Someone suggested turning off the sensitive content warning. No thanks, I don't want that. And I keep hearing that Apple only delivers features when they work perfectly. Absolutely not. This situation is just bad, and that's my personal opinion.
 

Reverend Benny

macrumors 6502a
Apr 28, 2017
867
649
Europe
A curious question from someone who hasn't tried this feature: has the Photos app identified these photos as photos of your children before you send them?
 

dialogos

macrumors 6502
Original poster
Sep 22, 2017
261
294
A curious question from someone who hasn't tried this feature: has the Photos app identified these photos as photos of your children before you send them?
Of course. I have a full album of my kids, with their faces recognized in various albums. One of the photos that was blocked yesterday showed my baby from the back at the beach, with her bottom showing. It’s a lovely photo taken on my wife’s phone at a location we frequently visit. It's unacceptable that this photo was blocked, and I don't care if Apple doesn’t recognize my child’s face. They should figure it out.

It's frustrating to have warned the grandparents so many times about scams and being cautious, only for them to receive a warning for a photo of their grandchild where the face is clearly visible. This warning doesn't just pop up; you have to go through a second page to accept it. And it doesn't save the sender as accepted or approved, so this will keep happening over and over.
 

Reverend Benny

macrumors 6502a
Apr 28, 2017
867
649
Europe
Of course. I have a full album of my kids, with their faces recognized in various albums. One of the photos that was blocked yesterday showed my baby from the back at the beach, with her bottom showing. It’s a lovely photo taken on my wife’s phone at a location we frequently visit. It's unacceptable that this photo was blocked, and I don't care if Apple doesn’t recognize my child’s face. They should figure it out.

It's frustrating to have warned the grandparents so many times about scams and being cautious, only for them to receive a warning for a photo of their grandchild where the face is clearly visible. This warning doesn't just pop up; you have to go through a second page to accept it. And it doesn't save the sender as accepted or approved, so this will keep happening over and over.
So what you want is this function: get a warning when you send or receive nude photos, but have it waived for all your family members or selected contacts?
Or do you mean that Apple should still show this warning for everyone, but use some clever AI or similar to recognize your children, even if their faces aren't showing, and not block those photos when they're sent?
 

dialogos

macrumors 6502
Original poster
Sep 22, 2017
261
294
So what you want is this function: get a warning when you send or receive nude photos, but have it waived for all your family members or selected contacts?
Or do you mean that Apple should still show this warning for everyone, but use some clever AI or similar to recognize your children, even if their faces aren't showing, and not block those photos when they're sent?

Using the term "nude photos" for pictures of my own children is a bit too much. It's a baby 🙄

Have you never taken a photo of a baby?

What I'm saying is that Apple should be able to recognize who belongs to my family: a person we share photos of the same kids with, from the same locations, the same everything, every day. They should let me put a person on an approved list. On top of that, my wife and the grandparents are in my family group. Is this some clever AI?

And if you read my earlier answer properly, they are also blocking photos where the face is shown. This is exactly what I wrote to you before: "... only for them to receive a warning for a photo of their grandchild where the face is clearly visible."

Other parents have encountered this issue before.
 
Last edited:

Reverend Benny

macrumors 6502a
Apr 28, 2017
867
649
Europe
Using the term "nude photos" for pictures of my own children is a bit too much. It's a baby 🙄

Have you never taken a photo of a baby?

What I'm saying is that Apple should be able to recognize who belongs to my family: a person we share photos of the same kids with, from the same locations, the same everything, every day. They should let me put a person on an approved list. On top of that, my wife and the grandparents are in my family group. Is this what you consider clever AI?

And if you read properly, they are also blocking photos where the face is shown. This is exactly what I also wrote before: "... only for them to receive a warning for a photo of their grandchild where the face is clearly visible."

Other parents with babies have encountered this issue before.
I appreciate that you edited your reply a number of times; the first versions were out of order.
I haven't used this feature, but I do manage phones at work and often decide which functions to turn on or off by default, so I have an interest in knowing how things work.

When it comes to the term "nude", it's simply what the software recognizes: whether it's a baby, a teen, an adult or whoever, the image is detected as showing little or no clothing. AI could probably refine this further so that the signals you suggest (location, device used to take the photo, face and body recognition, clothing, who you are with, etc.) are used to make this function even smarter.
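
To illustrate what I mean (only a rough sketch of the logic as I understand it, nothing like Apple's actual code; every name below is made up): the check today looks at the image alone, so who sent it and who is in it never enter into the decision.

```swift
import Foundation

// Purely illustrative sketch -- not Apple's implementation.
// The point: the on-device check classifies the picture itself, so the
// sender and the person pictured never factor into whether the warning fires.

struct IncomingPhoto {
    let imageData: Data    // the picture
    let senderID: String   // wife, grandparent, stranger -- ignored below
}

// Hypothetical stand-in for an on-device classifier that scores visual cues
// (exposed skin and so on) with no notion of age, identity or relationship.
func looksSensitive(_ imageData: Data) -> Bool {
    // Placeholder so the sketch compiles; a real model would score the image.
    return !imageData.isEmpty
}

func shouldShowWarning(for photo: IncomingPhoto) -> Bool {
    // photo.senderID is never consulted -- which is exactly why a photo of
    // your own baby and a photo from a stranger are treated the same today.
    return looksSensitive(photo.imageData)
}
```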
 

dialogos

macrumors 6502
Original poster
Sep 22, 2017
261
294
I appreciate that you edited your reply a number of times; the first versions were out of order.
I haven't used this feature, but I do manage phones at work and often decide which functions to turn on or off by default, so I have an interest in knowing how things work.

When it comes to the term "nude", it's simply what the software recognizes: whether it's a baby, a teen, an adult or whoever, the image is detected as showing little or no clothing. AI could probably refine this further so that the signals you suggest (location, device used to take the photo, face and body recognition, clothing, who you are with, etc.) are used to make this function even smarter.

Yes, I edited my post because I was upset that you called my baby photos "nude." Such a generalization is inappropriate, and I do consider you rude.

You mentioned you have never used this function, so why are you defending how Apple has implemented it? I'm highlighting a clear problem here, and the solution is simple. You keep using terms like "clever AI" and "nude photos," but I'm just saying that people in the family group should be approved. It's as simple as that. Please try it yourself so you have personal experience and an opinion of your own, instead of analyzing something you have never used.

Additionally, you are ignoring the fact that Apple is blocking photos where the face is clearly visible—photos that are part of our family albums. This is the same company that once considered implementing CSAM detection...

PS. A final question (and yes, I'm editing my post to add this... I hope it's not out of order): please advise me. How is someone supposed to take photos of their child in order to share them with their family? It's an honest question and I would like your answer, please. Once again, sorry for editing my post to add this part.

PPS.
When an older person, like a grandparent, buys a new iOS device, the sensitive content warning is turned on by default. So, when they receive a large warning for a photo, you can see the problem here, right? These are people who have been advised countless times to be cautious about online scams.
 
Last edited:

kitKAC

macrumors 6502a
Feb 26, 2022
801
755
Use a shared Photo album or exclude AirDrop from being affected by the Sensitive Content system.
 

dialogos

macrumors 6502
Original poster
Sep 22, 2017
261
294
Why don’t you wind down your aggressive attitude and tone? Take it up with Apple; go and be as rude and aggressive to them and see where it gets you.

Otherwise, deal with the issue. It ain’t gonna change from anything you post on here.

Aggressive attitude? I was simply upset that someone generalized a baby photo as a nude photo.

Is there a reason people defend Apple so fervently? According to you, there isn't any problem?
 

dialogos

macrumors 6502
Original poster
Sep 22, 2017
261
294
Use a shared Photo album or exclude AirDrop from being affected by the Sensitive Content system.

I've used shared albums a lot, but they’re not very convenient. For example, there are times when we want to send a new photo via iMessage or AirDrop. Additionally, older people often struggle with using photo albums.

Yes, a photo album is one solution, but this problem could be easily avoided if family members could be added to a safe list. Don't you think so?
 

StumpyBloke

macrumors 603
Apr 21, 2012
5,454
6,126
England
Aggressive attitude? I was simply upset that someone generalized a baby photo as a nude photo.

Is there a reason people defend Apple so fervently? According to you, there isn't any problem?

Typical expected response. Because I call you out on your inappropriate behaviour, I’m instantly defending Apple. Fantastic logic. For the record, I think Apple’s coding skills are worse than a monkey's, their system services are a joke; I could go on.

However, do what you want, and believe what you want. But I will say it again: report it to Apple. That is the only way you are going to get it resolved, if indeed they believe there is an issue.
 
  • Like
Reactions: JemTheWire

dialogos

macrumors 6502
Original poster
Sep 22, 2017
261
294
Typical expected response. Because I call you out on your inappropriate behaviour, I’m instantly defending Apple. Fantastic logic. For the record, I think Apple’s coding skills are worse than a monkey's, their system services are a joke; I could go on.

However, do what you want, and believe what you want. But I will say it again: report it to Apple. That is the only way you are going to get it resolved, if indeed they believe there is an issue.

What did you expect me to say when you called me aggressive? Instead of addressing someone calling a baby photo "nude," you’re attacking me for being upset?

Yes, I assumed you were defending Apple because you ignored the term "nude photos," labeled me as aggressive for criticizing Apple, and didn’t address my actual problem. I apologize if I misjudged you.

When I spoke to Apple, they suggested deactivating the Sensitive Content Warning. Many parents have similar experiences to share and Apple is aware.
 

MajorFubar

macrumors 68020
Oct 27, 2021
2,129
3,750
Lancashire UK
Aggressive attitude? I was simply upset that someone generalized a baby photo as a nude photo.

Is there a reason people defend Apple so fervently? According to you, there isn't any problem?
That's because there isn't a problem. You don't want to turn off the 'sensitive content' warning but you also expect to be able to somehow flag that the photos are of your own kin and that they shouldn't be classed as potentially inappropriate. Can't you even see that if that were possible it would get misused by CP perverts?
 

kitKAC

macrumors 6502a
Feb 26, 2022
801
755
I've used shared albums a lot, but they’re not very convenient. For example, there are times when we want to send a new photo via iMessage or AirDrop. Additionally, older people often struggle with using photo albums.

Yes, a photo album is one solution, but this problem could be easily avoided if family members could be added to a safe list. Don't you think so?

Sure, and you should log a feature enhancement request with Apple asking them to add that. IIRC, Sensitive Content Warning appeared in iOS 17, so maybe iOS 18 will already have it? Apple likes to iterate on things.
 

dialogos

macrumors 6502
Original poster
Sep 22, 2017
261
294
That's because there isn't a problem. You don't want to turn off the 'sensitive content' warning but you also expect to be able to somehow flag that the photos are of your own kin and that they shouldn't be classed as potentially inappropriate. Can't you even see that if that were possible it would get misused by CP perverts?

What I want is the ability to add people to a "safe list," just as I can block people.

I don’t see how my wife would misuse the photos she’s sending to me. 🙄

Additionally, this function is turned on by default, and it's not straightforward to deactivate. You need to navigate through several steps: go to Settings, then Privacy & Security, then Sensitive Content Warning, which guides you on to Screen Time, and so on. How is an older person, like a grandparent in my family group, supposed to manage this?

Do you personally believe that Apple has implemented this function without any issues?
 

dialogos

macrumors 6502
Original poster
Sep 22, 2017
261
294
Sure, and you should log a feature enhancement request with Apple asking them to add that. IIRC, Sensitive Content Warning appeared in iOS 17, so maybe iOS 18 will already have it? Apple likes to iterate on things.

Thank you very much for your response. I have already spoken with them twice over the phone. They didn’t mention anything about logging a feature enhancement request. They are aware of the issue and recommend turning off the feature, even on brand-new devices, if we don't like it.
 

kitKAC

macrumors 6502a
Feb 26, 2022
801
755
Thank you very much for your response. I have already spoken with them twice over the phone. They didn’t mention anything about logging a feature enhancement request. They are aware of the issue and recommend turning off the feature, even on brand-new devices, if we don't like it.

You can log your feedback here:

https://developer.apple.com/bug-reporting/

but maybe wait until iOS 18 is announced (in 2 weeks) and betas are available to see what changes they bring first.
 
  • Like
Reactions: dialogos

MajorFubar

macrumors 68020
Oct 27, 2021
2,129
3,750
Lancashire UK
What I want is the ability to add people to a "safe list," just as I can block people.

I don’t see how my wife would misuse the photos she’s sending to me. 🙄
That'll be because you're not a pedo.
I do get what you mean. I'm not intentionally defending Apple against the indefensible.
It's just I can also see why it is the way it is, unfortunately.
 
  • Like
Reactions: dialogos

dialogos

macrumors 6502
Original poster
Sep 22, 2017
261
294
That'll be because you're not a pedo.
I do get what you mean. I'm not intentionally defending Apple against the indefensible.
It's just I can also see why it is the way it is, unfortunately.

Thank you for your reply.
I raised a critique of Apple for flagging my baby's photos as inappropriate when shared multiple times between family members. In the responses, I've encountered terminology such as:
  • Nude photos
  • Asking for clever AI
  • Aggressive
  • Perverts
  • Pedos (twice)
I want to clarify that I do not mean to imply you spoke badly of me; on the contrary, your message was very polite. I am merely pointing out how people nowadays react to something this simple with such charged language.

PS. Once again, your message was very polite and to the point. I personally believe that our society is reaching a point where there is a significant group of people who feel a strong social need to defend societal norms without fully considering the problems their actions may cause.
 

ifxf

macrumors 6502
Jun 7, 2011
491
763
Today people toss out AI as the solution to every problem facing an app. Just about five years ago, blockchain was going to be the solution to all issues. Five years from now, tech evangelists will be promoting some new hot capability as the answer to every problem.
 

MajorFubar

macrumors 68020
Oct 27, 2021
2,129
3,750
Lancashire UK
Thank you for your reply.
I raised a critique of Apple for flagging my baby's photos as inappropriate when shared multiple times between family members. In the responses, I've encountered terminology such as:
  • Nude photos
  • Asking for clever AI
  • Aggressive
  • Perverts
  • Pedos (twice)
I want to clarify that I do not mean to imply you spoke badly of me; on the contrary, your message was very polite. I am merely pointing out how people nowadays react to something this simple with such charged language.

PS. Once again, your message was very polite and to the point. I personally believe that our society is reaching a point where there is a significant group of people who feel a strong social need to defend societal norms without fully considering the problems their actions may cause.
Unfortunately I don't think there is a one-size-fits-all solution to your problem, and I am definitely sympathetic to your plight. The existence of a minority of unsavoury people often means that overarching 'draconian' rules have to be put in place that impact all of us.
 

jb310

macrumors 6502
Aug 24, 2017
264
602
It might be easier to e-mail the photos to people instead of using AirDrop or iMessage. I don't think you'll get any kind of sensitive content warning there.
 

dialogos

macrumors 6502
Original poster
Sep 22, 2017
261
294
Unfortunately I don't think there is a one-size-fits-all solution to your problem, and I am definitely sympathetic to your plight. The existence of a minority of unsavoury people often means that overarching 'draconian' rules have to be put in place that impact all of us.

Once again, thank you for the message. I am not an advocate of draconian rules. However, in this case, Apple's implementation is flawed. For example, one of the photos that was recently blocked depicts my baby taking a shower. Here are the relevant details:

  1. The photo was sent by my wife.
  2. She is part of my family account.
  3. She is in my contacts.
  4. She sends me photos every day.
  5. The child was already recognized in the Photos app as my child, so the face recognition had already worked.
  6. The photo was taken at home, with the GPS location tagged as our home.
  7. I had recently accepted another photo with a content warning from my wife, so the system should have recognized this pattern.
How many more flags does Apple need to understand that this is safe content? Especially given that their own system had already recognized the face as my child.
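
To make the ask concrete, here is a rough sketch of the kind of check I mean (purely hypothetical, not any real iOS API; every name is invented for illustration). The warning would still appear for unknown senders; only trusted family members or people I had already approved would skip it.

```swift
// Hypothetical sketch of the proposed "safe list" -- not a real iOS API.
// All types and names are invented for illustration only.

struct Sender {
    let isInContacts: Bool
    let isInFamilySharingGroup: Bool
    let previouslyApprovedByUser: Bool   // e.g. "always allow from this person"
}

// Keep the warning for unknown senders; skip it for trusted ones.
func shouldShowSensitiveContentWarning(imageFlagged: Bool, from sender: Sender) -> Bool {
    guard imageFlagged else { return false }
    let trusted = sender.previouslyApprovedByUser
        || (sender.isInContacts && sender.isInFamilySharingGroup)
    return !trusted
}
```

Strangers, or anyone not on the list, would still see the warning exactly as they do today.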

My point is that if Apple can't properly deliver this system, maybe they shouldn't implement it at all. At the very least, they should let me have some control over it.

They don't offer the option to add family members to a safe list, but they do offer the option to completely turn off the Sensitive Content Warning system for everyone. Does that make sense? How are they protecting the people they want to protect if, in the end, anyone can just turn the system off entirely?
 