What I still haven’t been able to find anywhere are the details of what the human review process entails. It’s illegal for anyone other than NCMEC to have those images. So how does Apple review them? If an adult nude image is falsely flagged, will Apple assume it’s a true positive? Keep in mind some 25-year-olds look 15. And some 15-year-olds look like they are 25.

I think you are overthinking this.

An Apple representative will look at the images from the safety vouchers, and if at least one of them looks, to the reviewer, like CP as defined by federal law, it will be reported.

If Apple does this in good faith, they gain limited immunity from civil and criminal charges in any federal or state court.
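
To make the threshold mechanics concrete, here is a rough Swift sketch of the counting logic. The names are hypothetical, not Apple's API, and the real system keeps vouchers cryptographically sealed below the threshold (which Apple's threat-model document reportedly puts at roughly 30 matches); this only models the gate before human review.

```swift
// Hypothetical model of a safety voucher: in the real system its contents
// are encrypted and unreadable until the match threshold is crossed.
struct SafetyVoucher {
    let imageID: String
    let matchedKnownHash: Bool
}

// Human review becomes possible only once enough matches accumulate.
// The default of 30 reflects the figure in Apple's published threat model.
func accountCrossesReviewThreshold(_ vouchers: [SafetyVoucher], threshold: Int = 30) -> Bool {
    vouchers.filter { $0.matchedKnownHash }.count >= threshold
}
```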
 
Take that up with your non-understanding parents. Also, it’ll only flag to your parents if you received it in iMessage. Do your research in Safari and don’t save the pictures…

Did you just suggest a child should “take it up with their parents” over disapproval of homosexuality? How about support from Apple for the 13yo who wants to talk privately in iMessage with his first gay friend from school, isn’t out yet, and doesn’t want his parents to know? Maybe they’re romantically involved, and as an alternative to actual sexual activity they want to flirt using photos or share sexy pictures they found on the Internet? Are they actually abusing each other to the point that Apple should be notifying their parents? Is it possible this kind of scenario occurs far more often than, say, when a female adult English teacher texts her boobs to a high school student?
 
Did you just suggest a child should “take it up with their parents” over disapproval of homosexuality? How about support from Apple for the 13yo who wants to talk privately in iMessage with his first gay friend from school, isn’t out yet, and doesn’t want his parents to know? Maybe they’re romantically involved, and as an alternative to actual sexual activity they want to flirt using photos or share sexy pictures they found on the Internet? Are they actually abusing each other to the point that Apple should be notifying their parents? Is it possible this kind of scenario occurs far more often than, say, when a female adult English teacher texts her boobs to a high school student?
So, do you suggest that Apple invent an AI algorithm to scan iMessage content and distinguish between 12yos exchanging “innocent” sexually explicit materials and child predators grooming marginalized kids (like LGBTQ youth in repressive families and communities)?
Or do you think Apple (or any technology provider) just shouldn’t offer any parental controls over the activities that pre-teens engage in online?

By the way, in Apple’s planned feature, kids have agency to decide if they want to view material that will notify their parents. Nonetheless, why do you presume that parents of pre-teens are not manually auditing and viewing their kids’ iMessage communications anyway?
 
I think you are overthinking this.

An Apple representative will look at the images from the safety vouchers, and if at least one of them looks, to the reviewer, like CP as defined by federal law, it will be reported.

If Apple does this in good faith, they gain limited immunity from civil and criminal charges in any federal or state court.
But if Apple cannot legally have the raw images of CSAM, WHAT are they comparing matched images to? If they have nothing to compare against, then they need to make judgement calls. If it's any form of nude image with a subject that might LOOK young, will they say it's a positive match?

So you basically said it gets reported if it LOOKS like CP. So if someone is 25 but looks 15, Apple will make a judgement call on that, since in most jurisdictions an image of someone even at 17 years and 11 months is still considered CP.
 
No, the free pass should extend to anyone. It’s the equivalent of sharing a Playboy magazine in the 1970s. And it’s a reality that sexting is part of 21st century dating whether we like it or not, and that starts at adolescence. Most of us were sexually active well before the age of consent.
Well I don't think kids should have been looking at Playboy magazines either.
 
Did you just suggest a child should “take it up with their parents” over disapproval of homosexuality? How about support from Apple for the 13yo who wants to talk privately in iMessage with his first gay friend from school, isn’t out yet, and doesn’t want his parents to know? Maybe they’re romantically involved, and as an alternative to actual sexual activity they want to flirt using photos or share sexy pictures they found on the Internet? Are they actually abusing each other to the point that Apple should be notifying their parents? Is it possible this kind of scenario occurs far more often than, say, when a female adult English teacher texts her boobs to a high school student?
Yes, by Tim Cook, his associates, and the Apple cultists, of course; the ones who claim to have 10 iPads, 100,000,000 shares, and 5 Mac Pros, and that thanks to Apple their sons went to college/university, that they have a better life thanks to Apple, etc... oh, and their Apple Watch saved their lives, no doctors involved, it was the watch; that’s society these days.
 
Except the Taliban have a terrible track record and Apple has (mostly in functioning democracies) an industry-leading track record of championing privacy?
Industry-leading, yes, but far from perfect, and only because the others are very bad. They've been caught doing bad things numerous times. If those incidents were alarms, then what's happening now is a red flag, and I can guarantee that Apple will never regain that trust.
 
Again? Wasn't this article already posted?

Anyway, it's pretty much misleading from the start: it's not a backdoor in any technical sense; worse-for-privacy cloud scanning is already taking place at other photo library providers at the very least; and "scan users' photo libraries" conveniently forgets to mention that only pictures being uploaded to the cloud service are checked.

Perhaps the signatories should read the relevant technical documents and FAQs:

Keep telling yourself that the man behind the curtain isn't real. It's not going to hurt me, nor anyone else. Even when said man gets to a point where he's literally standing there, curtain wide open, wearing nothing at all but his own birthday suit with the words YES I'M REAL YOU FOOL scrawled across his forehead--it's still not going to affect us.

It's only going to affect you.
 
As a member of the LGBTQ community, I am dismayed by opponents of the parental controls in iMessage who presume a disparate impact on homosexual youth. The parental notification of potentially explicit imagery viewed on kids' devices applies only to kids under 13. Parents of all kids, including LGBT kids, deserve the ability to protect these young kids from being targeted by child predators.

Further, LGBT kids in repressive households or communities are MORE susceptible to grooming by child predators. We shouldn't limit parental controls because some parents have unhealthy parenting skills. Those parents could just take their kids' devices and check all the messages regardless.

The point has always been that Apple has no mechanisms to check for abusive parents (or indeed partners) lying about the age of older individuals in order to enable the functionality.
 
So, do you suggest that Apple invent an AI algorithm to scan iMessage content and distinguish between 12yos exchanging “innocent” sexually explicit materials and child predators grooming marginalized kids (like LGBTQ youth in repressive families and communities)?
Or do you think Apple (or any technology provider) just shouldn’t offer any parental controls over the activities that pre-teens engage in online?

By the way, in Apple’s planned feature, kids have agency to decide if they want to view material that will notify their parents. Nonetheless, why do you presume that parents of pre-teens are not manually auditing and viewing their kids’ iMessage communications anyway?

I suggest Apple not try to get into the parenting business. No scanning, no special flagging of any content at all. Hands off the personal usage data of anyone using an iPhone. Let law enforcement pursue legitimate leads from actual complaints and probable cause without relying on Apple to spy on their users. And I’m not talking about “pre-teens”; I’m talking about adolescents and older. Parents can monitor iPhone usage by looking at the phone anytime. We don’t need Apple intervening. Predators that groom kids for exploitation do it in person, on Facebook, on Snapchat, on TikTok, or on any of a number of apps. Let’s not pretend any of this stops by locking down iMessage. They will bypass it as easily as switching over to green bubbles or web-based messaging.
 
I suggest Apple not try to get into the parenting business. No scanning, no special flagging of any content at all. Hands off the personal usage data of anyone using an iPhone. Let law enforcement pursue legitimate leads from actual complaints and probable cause without relying on Apple to spy on their users. And I’m not talking about “pre-teens”; I’m talking about adolescents and older. Parents can monitor iPhone usage by looking at the phone anytime. We don’t need Apple intervening. Predators that groom kids for exploitation do it in person, on Facebook, on Snapchat, on TikTok, or on any of a number of apps. Let’s not pretend any of this stops by locking down iMessage. They will bypass it as easily as switching over to green bubbles or web-based messaging.
Apple is not involved in the process of "parenting" kids with this feature. This tool is for parents to do just that... to parent. Apple's parental notification is activated by parents and only applies to kids under 13, which is why I referenced "pre-teens." And why would you oppose a tool that helps parents work with law enforcement to "pursue legitimate" instances of child abuse?

So would you be against other platform vendors (Facebook, Snapchat, TikTok) enabling parental controls to mitigate the potential exploitation and trafficking of kids? Or are you an absolutist who doesn't think any technology providing parental control should be available, since it won't stop ALL abuse? I am not being snarky. I am earnestly confused about what you believe is acceptable parental control.
 
The point has always been that Apple has no mechanisms to check for abusive parents (or indeed partners) lying about the age of older individuals in order to enable the functionality.
With that logic, there should be no parental control features in any software or service, since they can be abused by those with sufficient access to another adult's device or accounts.
 
I suggest Apple not try to get into the parenting business. No scanning, no special flagging of any content at all. Hands off the personal usage data of anyone using an iPhone. Let law enforcement pursue legitimate leads from actual complaints and probable cause without relying on Apple to spy on their users. And I’m not talking about “pre-teens”; I’m talking about adolescents and older. Parents can monitor iPhone usage by looking at the phone anytime. We don’t need Apple intervening. Predators that groom kids for exploitation do it in person, on Facebook, on Snapchat, on TikTok, or on any of a number of apps. Let’s not pretend any of this stops by locking down iMessage. They will bypass it as easily as switching over to green bubbles or web-based messaging.
ChromeAce, parental control options already exist, and Apple is just expanding the feature set for parents who want to control/monitor what their kids do with their Apple devices. I wish Apple would add the ability for parents to block hotspot usage on their devices, since kids use them to bypass parental controls at home. I understand some people have concerns about how parents will use this, but as long as the parents are not violating the law, they have the right to raise their children how they want, even if you disagree with how they are doing it.
 
Apple is not involved in the process of "parenting" kids with this feature. This tool is for parents to do just that... to parent. Apple's parental notification is activated by parents and only applies to kids under 13, which is why I referenced "pre-teens." And why would you oppose a tool that helps parents work with law enforcement to "pursue legitimate" instances of child abuse?

So would you be against other platform vendors (Facebook, Snapchat, TikTok) enabling parental controls to mitigate the potential exploitation and trafficking of kids? Or are you an absolutist who doesn't think any technology providing parental control should be available, since it won't stop ALL abuse? I am not being snarky. I am earnestly confused about what you believe is acceptable parental control.

I am not against parental controls on an iPhone. I am against a) putting ALL iPhone users under suspicion by scanning ALL iCloud photos for hash matches, and b) flagging nudity sent to a child as a potential abuse situation. The latter is the job, and the morality, of the parent to decide. It should be off by default and opt-in under Parental Controls. Attempting to prevent child abuse goes too far when it infringes on the privacy and objectivity of those not a party to it. We live in a free society, not one designed to be locked down and surveilled in a pre-crime state. Children deserve to grow up in a world free from constant fear and supervision, to be themselves and not look at the world as an obstacle course of evil.

The fact is, a child is far more likely to be abused by a family member than a stranger online. Yet would you advocate surveillance of digital interactions between parents and children to spot potential red flags for abuse?
 
Turn off iCloud Photos and nothing is shared off the device.

I guess the question is, what is a better method of preventing the distribution of child abuse images on online services? Mass scanning of all images in the cloud, with opaque algorithms, by Google, Microsoft, and Facebook? Currently, Apple identifies far less of this harmful and illegal content than other cloud providers.
Contradictions: for a start, "mass scanning of all images in the cloud" is actually a stretch. MORE scanning will take place on hardware than in the cloud, since logically there is far more material on devices, and some people don't use the cloud at all. A billion-plus devices, all carrying scanning software, irrespective of whether it's hash-only, represent more surveillance than assuming that every picture on every device ends up in the cloud.

"Apple identifies fewer of this harmful and illegal content than other cloud suppliers"

Evidence, please? Or could it be that Apple identifies less because it intends to use our hardware rather than the cloud? It figures: if they've not been scanning via the cloud, they would hardly be up with Google etc. in identifying such material via the cloud.

In any event, the focus point isn't the nature of the scanning; it is the infestation of 1,000,000,000+ users' private hardware with software tools that can easily be modified, giving a backdoor to over 1,000,000,000 individual users and even allowing targeted modification. Because the software is on your system, it bypasses System Integrity Protection, so modification is comparatively easy. And unlike a lot of Apple functions, which are designed with users in mind and often give the user the choice to switch them on or off, this software is not designed to assist the user at all — the user who pays for their hardware.
 
As a member of the LGBTQ community, I am dismayed by opponents of the parental controls in iMessage who presume a disparate impact on homosexual youth. The parental notification of potentially explicit imagery viewed on kids' devices applies only to kids under 13. Parents of all kids, including LGBT kids, deserve the ability to protect these young kids from being targeted by child predators.

Further, LGBT kids in repressive households or communities are MORE susceptible to grooming by child predators. We shouldn't limit parental controls because some parents have unhealthy parenting skills. Those parents could just take their kids' devices and check all the messages regardless.
A good point, but it should also be noted that you have hit the nail on the head re: child abuse/child pornography and the initial point of contact, which is NOT child pornography. It usually starts with grooming, and that is usually via social media as the initial point of contact; iMessage contact would come after the initial contact, and usually does. That's why concentrating on photos, for example, is locking the stable door after the horse has bolted, irrespective of whether a child is LGBT or not. Personally, I don't like categorisation at all, just as I don't really like having lots of different terms for different forms of the same thing: discrimination. We should have gone past that; we should refer just to children, just as adult box categories have often outlived their usefulness. We are all in one race, the human race, and the sooner the world gets it the better.
 
It’s checking a hash when a file is uploaded to Apple’s servers. It is illegal to hold abusive images on Apple’s servers. It’s the law.
If it's about CSAM on Apple's servers, *Apple should scan people's iCloud libraries.*
Again: Apple hasn't been doing that, if you look at the number of cases they've reported to the NCMEC, and they're not planning to scan people's existing iCloud libraries.
People who already have CSAM in iCloud are safe; Apple will only scan NEW uploads.
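
For what it's worth, here is a sketch of the upload-gated flow being described. This is illustrative only: Apple's NeuralHash is a perceptual hash, and the real matching uses private set intersection so the device doesn't learn the result; SHA-256 stands in purely to show the control flow, and the function names are made up.

```swift
import CryptoKit
import Foundation

// Hash an image locally at upload time. (Stand-in: the real system uses a
// perceptual hash so near-duplicates match, not a cryptographic hash.)
func uploadHash(for imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// The check only runs for images entering the cloud-upload pipeline;
// with the cloud library disabled, this never executes for a photo.
func shouldAttachSafetyVoucher(imageData: Data, knownHashes: Set<String>) -> Bool {
    knownHashes.contains(uploadHash(for: imageData))
}
```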
 
Was a little perturbed to see, elsewhere, a personal reference to Tim as a rat.

Not true, you know. I've never particularly thought Tim was of the same ilk as Steve; Tim, in my opinion, is marketing-orientated, bringing a lot of gizmos and added glitz to Apple, whereas I've always preferred usability and functional prowess.

He has his pedigree, though: a degree in Industrial Engineering followed by a Master's in Business Administration, and in my opinion that's the path he has followed since. Steve didn't have that, but in my opinion he is still the star that makes Apple even possible today. He dropped out, but was always interested in computers, and both he and Woz were in the Homebrew Computer Club: a dedicated bunch of what some would call nerds, where creativity and innovation flourished.

You often find, though, that the creative geniuses are eventually superseded by those with other skills. Apple must be happy with Tim, as otherwise he wouldn't be at the helm, so he deserves not to be personally abused, verbally or otherwise.

The Computer History Museum is worth a look, although it does seem to skate over the very start of computing.

Both were different in their approach, but it's no good anyone personally attacking Tim. The situation is, in my opinion, a bad mistake, whether through leverage from outside Apple or not, but name-calling really doesn't do him or anyone else justice.

On business performance, you have to give Tim credit: he's been at the head of Apple since 2011, and you don't hold that position for ten years unless you are doing something right, albeit I'm sometimes not that keen on the direction, but that's to be expected.

All companies make mistakes, and all people make mistakes, and in my opinion this idea of infecting hardware, however you try to skate around it, is what it is: a bad idea.


 
Our research project began two years ago, as an experimental system to identify CSAM in end-to-end-encrypted online services. As security researchers, we know the value of end-to-end encryption, which protects data from third-party access. But we’re also horrified that CSAM is proliferating on encrypted platforms. And we worry online services are reluctant to use encryption without additional tools to combat CSAM.

We were so disturbed that we took a step we hadn’t seen before in computer science literature: We warned against our own system design, urging further research on how to mitigate the serious downsides. We’d planned to discuss paths forward at an academic conference this month.

That dialogue never happened. The week before our presentation, Apple announced it would deploy its nearly identical system on iCloud Photos, which exists on more than 1.5 billion devices.

Published yesterday at the link below; well worth a read!

https://www.washingtonpost.com/opin...-abuse-encryption-security-privacy-dangerous/
 
I am not against parental controls on an iPhone. I am against a) putting ALL iPhone users under suspicion by scanning ALL iCloud photos for hash matches, and b) flagging nudity sent to a child as a potential abuse situation. The latter is the job, and the morality, of the parent to decide. It should be off by default and opt-in under Parental Controls. Attempting to prevent child abuse goes too far when it infringes on the privacy and objectivity of those not a party to it. We live in a free society, not one designed to be locked down and surveilled in a pre-crime state. Children deserve to grow up in a world free from constant fear and supervision, to be themselves and not look at the world as an obstacle course of evil.

The fact is, a child is far more likely to be abused by a family member than a stranger online. Yet would you advocate surveillance of digital interactions between parents and children to spot potential red flags for abuse?
Perhaps you are unaware, but the iMessage feature is indeed off by default and opt-in by parents who have configured their children's devices. This is a tool for parents to parent, not mass surveillance by Apple. Apple won't even know any of the iMessage content, since the AI runs on-device on end-to-end encrypted iMessage content. Where is the invasion of privacy? Do you think children under the age of 13 have an expectation of privacy on their iPhones from their parents?

It is not presented as a tool to prevent all abuse, but rather as another parental control feature to mitigate the horror of child abuse and the distribution of illegal content.
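
Sketching the publicly described decision flow may help here (all names are hypothetical, not Apple's API): the blur-and-warn step is driven by the on-device classifier, while the parent notification requires three separate conditions.

```swift
// Hypothetical model of a child account under Family Sharing.
struct ChildAccount {
    let age: Int
    let parentEnabledNotifications: Bool
}

enum ImagePresentation { case show, blurWithWarning }

// Flagged images are blurred behind a warning; nothing is sent anywhere.
func present(imageFlaggedExplicit: Bool) -> ImagePresentation {
    imageFlaggedExplicit ? .blurWithWarning : .show
}

// Per Apple's public description, a parent is notified only if the account
// is under 13, the parent opted in, AND the child chooses to view anyway.
func notifyParent(account: ChildAccount, childChoseToView: Bool) -> Bool {
    account.age < 13 && account.parentEnabledNotifications && childChoseToView
}
```

Nothing in this flow transmits image content; only the child's own choice to view can trigger the notification.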
 
Expect the class-action lawsuits to start flying if the allegations prove correct that Apple implemented this without notification and, as alleged, it was only noticed after the fact, hence the public announcement.

Oh Apple, you have really screwed up over this.


Worth learning to use Console or similar, depending on what device(s) you own.
 