How about letting ADULTS report nudity in iMessage?
I could see that easily descending into reverse revenge porn: reporting someone after a breakup or other spat for pictures that were exchanged consensually, to get their account shut down, or at least suspended pending a review. With minor accounts it's cut and dried, but between adults it requires knowledge of context that Apple likely doesn't want to wade into.
 
Nobody is a parent anymore. Just adults with pet children. Maybe stop giving your kids access to the internet before they're old enough?! Actually PARENT your children. The child should be telling their parents so they can go to the police. Why are we deferring that responsibility to a Tech Corporation!?

Blanket phrases like "nobody is a parent anymore" are neither helpful nor accurate.

I worked in pediatric healthcare for a few years and dealt with children in bad situations that had very involved parents who were very on top of things. Kids get embarrassed, scared, threatened, and mired in a multitude of other situations that make them not want to talk to their guardians, regardless of how involved or close they are. I think this is a great feature!
 
Good move by Apple! I also think it's a way more thoughtful one than simply diving into all user photos to scan for CSAM, which obviously leads to massive privacy intrusion for a minimal amount of actually harmful content. This approach respects privacy much better while still being highly useful.
 
No nudes, people. Adult or otherwise.

I just don't get it. I never had any desire to send a picture of my junk to another person. If the other person would be interested in getting a picture like that then I'm not interested in being around that person.

Different um, whatever for different folks, but I never understood this.
 
With Project 2025 here in America, we will be reported to law enforcement for being gay if they suspect it (the person who you talk to on iMessage) and they happen to lean towards the Right. So if you send them a picture of you kissing someone of the same gender, it would cause that person to get arrested if Project 2025 is passed here.
You currently can't even protest on college campuses.

So... not sure we need Project 2025 to already be in Project (insert name)
 
Perhaps, but that's a complicated situation. Do we want to be able to identify criminals and take them off the streets, or prevent SOME of the crimes they commit while allowing them to continue criminal activity with others (e.g., those who don't have iPhones)? Should Apple be in the business of referring text messages directly to law enforcement without the victim being involved in the process? How much invasion of privacy is acceptable? What happens when a system starts interfering with legitimate messages (e.g., two consenting adults trading nudie pics with each other)?

These are just a few of the questions that have been raised in the past. I do agree that building in a reporting function is a good idea.
I did initially think the way your first paragraph does, but how is this any different, in the end, from taking screenshots in ANY messaging app, even the most private ones, and reporting those to the cops? And at my work we've talked about restricting screenshots (unrelated to CSAM), but in the end, you can't prevent anyone from taking a photo of the screen.

So the reporting is interesting but I do think we need to ensure it isn't abused--could you "SWAT" someone you don't like?
 
With Project 2025 here in America, we will be reported to law enforcement for being gay if they suspect it (the person who you talk to on iMessage) and they happen to lean towards the Right. So if you send them a picture of you kissing someone of the same gender, it would cause that person to get arrested if Project 2025 is passed here.
Somewhere in there is a career in comedy
 
I worked in pediatric healthcare for a few years and dealt with children in bad situations that had very involved parents who were very on top of things. Kids get embarrassed, scared, threatened, and mired in a multitude of other situations that make them not want to talk to their guardians, regardless of how involved or close they are.

I'm pretty sure this is going to lead to the cops showing up asking to talk to their parents... This is certainly not a low-drama, unembarrassing solution to this problem, if that's what kids are after. It can certainly be leveraged into a high-drama exercise if that's what they want to have happen, though.

There are times bypassing parents is appropriate-- usually if parents are part of the problem. That's typically handled with school resources and numbers you can call for help-- I don't see where this adds anything beyond being a number to call. Handing things like this to a foreign multi-national tech company that will apply global policies that almost certainly involve local governments is a coarse solution to what you're suggesting is a delicate problem.

And giving this to kids, who by nature have an underdeveloped sense of consequences, brings its own problems along with any potential solutions. I fully expect we're going to see this weaponized specifically because of the drama it can create. SWATing is a thing that actual grown adults do, and they should certainly know better. Kids looking for leverage in the lunchroom?
 
Good move by Apple! I also think it's a way more thoughtful one than simply diving into all user photos to scan for CSAM, which obviously leads to massive privacy intrusion for a minimal amount of actually harmful content. This approach respects privacy much better while still being highly useful.

This addresses a completely different problem than the CSAM scanning was meant to, and the CSAM scanning wouldn't have operated anything like you're suggesting it would.

One thing this alert isn't is respectful of privacy, but then the law seems to make that impossible:
[attached screenshot]
 
I'm pretty sure this is going to lead to the cops showing up asking to talk to their parents... This is certainly not a low-drama, unembarrassing solution to this problem, if that's what kids are after. It can certainly be leveraged into a high-drama exercise if that's what they want to have happen, though.

There are times bypassing parents is appropriate-- usually if parents are part of the problem. That's typically handled with school resources and numbers you can call for help-- I don't see where this adds anything beyond being a number to call. Handing things like this to a foreign multi-national tech company that will apply global policies that almost certainly involve local governments is a coarse solution to what you're suggesting is a delicate problem.

And giving this to kids, who by nature have an underdeveloped sense of consequences, brings its own problems along with any potential solutions. I fully expect we're going to see this weaponized specifically because of the drama it can create. SWATing is a thing that actual grown adults do, and they should certainly know better. Kids looking for leverage in the lunchroom?

As I understand it, this is reviewed by a real person at Apple, with context, before being transferred to authorities. Not everything reported goes straight to the authorities.
 
Why would you allow your child to be communicating with someone they don't know? Block all contacts except the ones that the parents specify and lock down the phone.
 