
MacRumors

macrumors bot
Original poster


Starting with iOS 18.2, children in Australia have a new option to report iMessages containing nude photos and videos to Apple, the company told The Guardian. Apple said it will review these reports and could take action, such as by disabling the sender's Apple Account and/or reporting the incident to law enforcement.

[Image: iOS 18.2 Communication Safety report feature]

The report outlined what these reports will include:
The device will prepare a report containing the images or videos, as well as messages sent immediately before and after the image or video. It will include the contact information from both accounts, and users can fill out a form describing what happened.
The feature comes after Australia introduced new rules that will require tech companies like Apple to take stronger measures to combat child sexual abuse material (CSAM) on their platforms by the end of 2024, according to the report.

Apple said it plans to make this feature available globally in the future, according to the report, but no timeframe was provided.

This is an extension of Apple's existing Communication Safety feature for iMessage, which launched in the U.S. with iOS 15.2 in 2021. With the release of iOS 17 last year, Apple expanded the feature worldwide and enabled it by default for children who are under the age of 13, signed in to their Apple Account, and part of a Family Sharing group.

Communication Safety is designed to warn children when they receive or send iMessages containing nudity, and Apple ensures that the feature relies entirely on on-device processing as a privacy protection. The feature also applies to AirDrop content, FaceTime video messages, and Contact Posters in the Phone app. Parents can turn off the feature on their child's device in the Settings app under Screen Time if they wish to.

The nudity reporting option comes after Apple in 2022 abandoned its controversial plans to detect known CSAM stored in iCloud Photos.

The first iOS 18.2 beta was released yesterday for devices with Apple Intelligence support, including iPhone 15 Pro models and all iPhone 16 models. The software update is expected to be widely released to the public in December.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Article Link: iOS 18.2 Lets Children Report Nudity in iMessages, Starting in Australia
 
Wouldn't it be better for Apple to use Apple Intelligence to analyze the image, check for nudity, and block it?
Perhaps, but that's a complicated situation. Do we want to be able to identify criminals and take them off the streets, or prevent SOME of the crimes they commit while allowing them to continue criminal activity with others (e.g., those who don't have iPhones)? Should Apple be in the business of referring text messages directly to law enforcement without the victim being involved in the process? How much invasion of privacy is acceptable? What happens when a system starts interfering with legitimate messages (e.g., two consenting adults trading nudie pics with each other)?

These are just a few of the questions that have been raised in the past. I do agree that building in a reporting function is a good idea.
 
Perhaps, but that's a complicated situation. Do we want to be able to identify criminals and take them off the streets, or prevent SOME of the crimes they commit while allowing them to continue criminal activity with others (e.g., those who don't have iPhones)? Should Apple be in the business of referring text messages directly to law enforcement without the victim being involved in the process? How much invasion of privacy is acceptable? What happens when a system starts interfering with legitimate messages (e.g., two consenting adults trading nudie pics with each other)?
To your last point: with how this system works, one (or both) of the adults would have to be using a child account and would also have to *choose* to submit the report, so this system wouldn't really impact that scenario, unless for some reason you're using a child account with sensitive content protection enabled.

The worst abuse I can imagine is someone misleading a person into sending a nude to a child account they've set up, but I feel it would be very difficult to prove intentional malice on the sender's part in that case.
 
To your last point: with how this system works, one (or both) of the adults would have to be using a child account and would also have to *choose* to submit the report, so this system wouldn't really impact that scenario, unless for some reason you're using a child account with sensitive content protection enabled.

The worst abuse I can imagine is someone misleading a person into sending a nude to a child account they've set up, but I feel it would be very difficult to prove intentional malice on the sender's part in that case.
The question of intent can be tricky. If you send a nudie pic to someone that you THINK is a minor, it doesn't actually matter if they ARE a minor; your intent is what makes it a crime, in many cases. Sending a nudie pic to someone that you think is an adult, but who turns out to be a minor, is ALSO a crime. The only way it's not a crime is if both people are adults and the sender believes/knows the other person is an adult.

Of course the best way to avoid problems is not to send nudie pics, period. If you are going to, make damn sure the other person is who you think they are.
 
With Project 2025 here in America, we could be reported to law enforcement for being gay if the person you talk to on iMessage suspects it and happens to lean towards the Right. So if you send them a picture of you kissing someone of the same gender, it could get you arrested if Project 2025 is enacted here.
 
Nobody is a parent anymore. Just adults with pet children. Maybe stop giving your kids access to the internet before they're old enough?! Actually PARENT your children. The child should be telling their parents so they can go to the police. Why are we deferring that responsibility to a Tech Corporation!?
 
Nobody is a parent anymore. Just adults with pet children. Maybe stop giving your kids access to the internet before they're old enough?! Actually PARENT your children. The child should be telling their parents so they can go to the police. Why are we deferring that responsibility to a Tech Corporation!?
You think being a teenager isn’t old enough?
 
I don't see anything wrong with this. It's good to empower children to take action when being harassed like this.

Nobody is a parent anymore. Just adults with pet children. Maybe stop giving your kids access to the internet before they're old enough?! Actually PARENT your children.

You act like you were never irresponsible as a kid. Nobody thinks about consequences when they're young. Plus, it's impossible to shelter kids from the internet in 2024. If they don't use it at home, they'll use it at school.
 
If the wrong government gets in
You will be arrested just for sending a person a picture of you kissing the same gender. That’s a possibility here in America if the OrangeFelon wins.

That's predicated on Tim (a fellow homosexual) cooperating with a government mandate of that nature. Not to underplay the clear and present issues with a particular candidate, though.
 
Wouldn't it be better for Apple to use Apple Intelligence to analyze the image, check for nudity, and block it?
While AI can certainly make moderation easier, it’s not always better. There are many documented cases where AI has struggled with moderation, either failing to flag inappropriate content or incorrectly flagging harmless posts. A hybrid approach, where AI flags content for human review, might be helpful, but it still carries the risk of the AI overlooking something obvious to a human as illegal.
 
I want you to re-read my post and circle the word "Teenager" if you see it.
My mistake; I had the impression that this feature was for anyone under 18, in which case your post would imply you felt anyone in that group was too young to be online.

But after looking at it more closely, I see that it's only enabled by default for children under 13.
 
I hope it doesn't cause as many false positives as the text message "spam" button, which is just too easy to tap by mistake. Haha, the number of times I've reported a legit SMS as spam by accident...

With Project 2025 here in America, we could be reported to law enforcement for being gay if the person you talk to on iMessage suspects it and happens to lean towards the Right. So if you send them a picture of you kissing someone of the same gender, it could get you arrested if Project 2025 is enacted here.
If you're not receiving pictures of genitalia on a child-designated account, the option doesn't even appear, so both of these scenarios seem very unrealistic.

Nobody is a parent anymore. Just adults with pet children. Maybe stop giving your kids access to the internet before they're old enough?! Actually PARENT your children. The child should be telling their parents so they can go to the police. Why are we deferring that responsibility to a Tech Corporation!?
Because oftentimes, when people of all ages (but especially children and teenagers) are being sexually harassed or abused, it can be very uncomfortable to discuss these topics with others. The idea that this exists because "people don't want to parent" is really shortsighted.
 
Nobody is a parent anymore. Just adults with pet children. Maybe stop giving your kids access to the internet before they're old enough?! Actually PARENT your children. The child should be telling their parents so they can go to the police. Why are we deferring that responsibility to a Tech Corporation!?
You can’t shelter kids forever. They will eventually be out in the world interacting with other people. There is nothing wrong with providing people with more tools to help protect themselves. The child should tell their parents, but sometimes, kids feel that they can’t do that.
 