MacRumors had an article about how that doesn't really work as intended; here, let me give you the link:
Oh, wait, it's the article you're commenting on, whoops.
Oh, wait, I just tested it and if you do this...
[Screenshot attached: Bildschirmfoto 2021-08-25 um 19.44.32.png]

...you can't even download an app with a higher age rating. 😲
 
  • Like
Reactions: bousozoku
The problem with age restriction systems is that they need the complete cooperation of adults to work. It is when adults don't give a damn and allow children access to their information that age restriction systems fail, and there is nothing that can be done about it.

If a child is able to get access to an adult's personal information, which is used to create a user account, then there is not an age restriction system in the world that will prevent that child from accessing age-restricted material.
 
  • Like
Reactions: deevey
This "report" does not even mention the word Parent or Parental Controls! It would appear that there was an expectation that just setting the age of the user, automatically enabled the parental controls.

With parental controls, a parent could decide that their child of 13 years and 364 days could download music with explicit ratings and download "violent games" that are rated 14 or 17+.
 
Sorry, they actually started hash checks in 2008.

My question still remains.

So that’s 13 years of Apple refusing to participate in checking for CSAM because they didn’t want to pay the price themselves.

Now that they can pass on the cost, they’re acting like they really care.

Huh? So who's been paying all the Apple employees and those of other companies to develop and implement this CSAM detection technology for iOS 15?
 
My question still remains.



Huh? So who's been paying all the Apple employees and those of other companies to develop and implement this CSAM detection technology for iOS 15?

Apple is ONLY interested in CSAM technology which also allows them to scan a user’s personal files for arbitrary signatures.

If they really only cared about checking iCloud photos for known CSAM then they could have implemented that at any point in the last 13 years. But they didn’t.
 
  • Like
  • Disagree
Reactions: giggles and VulchR
Apple is ONLY interested in CSAM technology which also allows them to scan a user’s personal files for arbitrary signatures.

If they really only cared about checking iCloud photos for known CSAM then they could have implemented that at any point in the last 13 years. But they didn’t.

And if Apple were really interested in snagging user personal files, they could have secretly implemented that years ago.
 
So, to summarise: Apple is planning on doing invasive surveillance on your iPhone on the basis of (1) their technical competence, (2) their concern for children, and (3) trust regarding their motives and moral resolve, yet they can't manage to keep adult-rated apps, from which they make a profit, out of the hands of kids who have declared themselves to be 14-year-olds?

The vetting of apps in the App Store has been a neglected aspect of Apple's operations for years, yet they chose to develop spyware (IMO) integral to iOS using AI. Well, how about applying AI to sort out app vetting instead??? Apple is sinking lower in my estimation daily.
 
Apple is ONLY interested in CSAM technology which also allows them to scan a user’s personal files for arbitrary signatures.

If they really only cared about checking iCloud photos for known CSAM then they could have implemented that at any point in the last 13 years. But they didn’t.

You have no idea what you're talking about. That's not at all what's going on with Apple's CSAM detection. The only files being scanned are photos the user is syncing with iCloud and the "signatures" are not arbitrary, but a database of known CSAM hashes. Additionally, since Apple has no access to your device, they ONLY know about images uploaded to iCloud (and then only detected CSAM . . . and THEN only if there's at least 30 of those detected images). They have already told you precisely what is happening. If you're calling them liars, then present your evidence to prove that.

And again, your logic is whack. You're basically saying that unless someone does everything right from day 1, then their motives are wrong if they start doing the right thing in the future. First of all, that defies all rules of logic. Secondly, my understanding is that one of the main reasons Apple has resisted scanning for CSAM on the cloud previously was precisely because they put such a high value on privacy - now they've come up with a way to maintain that commitment to privacy while also still being able to detect illegal imagery being uploaded to their servers in violation of both their policies and federal law. If you think there's more to it than that, then don't just throw wild theories out there - present some hard evidence.
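
To make that concrete, here's a minimal sketch of the flow as Apple has described it publicly. To be clear, this is not Apple's actual code (the real system uses a perceptual NeuralHash plus a blinded database and private set intersection, not a plain hash lookup), and every name below is made up for illustration:

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in: the real database is a blinded set of known-CSAM
// hashes shipped inside the OS, not something readable like this.
func loadKnownCSAMHashes() -> Set<String> {
    return []
}

let knownHashes = loadKnownCSAMHashes()
let reportingThreshold = 30   // Apple's published match threshold

// SHA-256 is used here only to keep the sketch self-contained and runnable;
// Apple's system uses a perceptual hash (NeuralHash), not a cryptographic one.
func imageHash(_ photoData: Data) -> String {
    SHA256.hash(data: photoData).map { String(format: "%02x", $0) }.joined()
}

// Only photos queued for iCloud upload are ever considered, and nothing is
// reportable until the number of matches crosses the threshold.
func exceedsThreshold(photosQueuedForICloud: [Data]) -> Bool {
    let matches = photosQueuedForICloud.filter { knownHashes.contains(imageHash($0)) }
    return matches.count >= reportingThreshold
}
```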
 
Maybe they did. This is a thought I wouldn't have entertained a month ago. How times change.

LOL! Maybe they were also running a secret child sex slave trade. I mean it's pOsSiBlE, right? Again, present your evidence. If you don't have any, then I'd suggest keeping your wild theories to yourself, as each verbalized one makes you less and less credible to people who value rationality over conspiracy theories and fear-mongering.
 
...The only files being scanned are photos the user is syncing with iCloud and the "signatures" are not arbitrary, but a database of known CSAM hashes. ...
You do understand that with trivial modifications the hashes could be of anything, applied to any file, right?
 
You have no idea what you're talking about. That's not at all what's going on with Apple's CSAM detection. The only files being scanned are photos the user is syncing with iCloud and the "signatures" are not arbitrary, but a database of known CSAM hashes. Additionally, since Apple has no access to your device, they ONLY know about images uploaded to iCloud (and then only detected CSAM . . . and THEN only if there's at least 30 of those detected images). They have already told you precisely what is happening. If you're calling them liars, then present your evidence to prove that.

And again, your logic is whack. You're basically saying that unless someone does everything right from day 1, then their motives are wrong if they start doing the right thing in the future. First of all, that defies all rules of logic. Secondly, my understanding is that one of the main reasons Apple has resisted scanning for CSAM on the cloud previously was precisely because they put such a high value on privacy - now they've come up with a way to maintain that commitment to privacy while also still being able to detect illegal imagery being uploaded to their servers in violation of both their policies and federal law. If you think there's more to it than that, then don't just throw wild theories out there - present some hard evidence.

Apple’s current implementation only checks for known CSAM images. By changing 2-3 lines of code in the future, they would have access to your entire phone.

Please answer this: Apple could start checking this known CSAM database against iCloud images TODAY if they really cared. Or at any point in the last 13 years. But no. Why not?
 
  • Like
  • Disagree
Reactions: usagora and VulchR
So, to summarise: Apple is planning on doing invasive surveillance on your iPhone on the basis of (1) their technical competence, (2) their concern for children, and (3) trust regarding their motives and moral resolve, yet they can't manage to keep adult-rated apps, from which they make a profit, out of the hands of kids who have declared themselves to be 14-year-olds?

The vetting of apps in the App Store has been a neglected aspect of Apple's operations for years, yet they chose to develop spyware (IMO) integral to iOS using AI. Well, how about applying AI to sort out app vetting instead??? Apple is sinking lower in my estimation daily.
Use the damn parental controls on your kid's iPhone. WTF is so hard about that?
 
I find this somewhat confusing:

"If Apple already knows that a user is under 18, how can it let the user download adult apps in the first place?"

It seems it should be a relatively straightforward thing: Apps and content should be rated as to the minimum age appropriate for them, and the App Store or App Store app compares the user's claimed age against that number.

How flippin' hard can it be?
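
Just to illustrate how small the technical piece actually is, here's a rough sketch of that comparison. These types and names are invented for the example, not real App Store APIs:

```swift
import Foundation

// Invented types: the real App Store has its own representation of
// age ratings and account birth dates.
struct AppListing {
    let name: String
    let minimumAge: Int   // e.g. 4, 9, 12 or 17, taken from the age rating
}

struct Account {
    let declaredBirthDate: Date
}

// The whole "age gate" reduces to one inequality against the declared age.
func canDownload(_ app: AppListing, for account: Account, now: Date = Date()) -> Bool {
    let age = Calendar.current.dateComponents([.year],
                                              from: account.declaredBirthDate,
                                              to: now).year ?? 0
    return age >= app.minimumAge
}

// A self-declared 14-year-old requesting a 17+ app should simply get back false.
```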

(N.B.: This does not speak to the appropriateness of Apple being a substitute for parental responsibility or whether Apple should be doing any of this at all, but merely the technical challenge.)

To others: Can we please not turn this into Yet Another CSAM Thread?
 
  • Like
Reactions: 5105973 and VulchR
LOL! Maybe they were also running a secret child sex slave trade. I mean it's pOsSiBlE, right? Again, present your evidence. If you don't have any, then I'd suggest keeping your wild theories to yourself, as each verbalized one makes you less and less credible to people who value rationality over conspiracy theories and fear-mongering.
Satire. *cough*
 
Use the damn parental controls on your kid's iPhone. WTF is so hard about that?
Does Apple need parental controls to avoid selling an 18+ rated app to a self-declared 14-year-old? I would have thought they would understand computing functions related to basic inequalities.
 
And if Apple were really interested in snagging user personal files, they could have secretly implemented that years ago.

No they couldn’t. The CPU process and data access would be obvious. Now they have the excuse that any and all access to local files, and the broadcast of the results to an external server, is for CSAM.
 
  • Like
  • Disagree
Reactions: giggles and VulchR
Does Apple need parental controls to avoid selling an 18+ rated app to a self-declared 14-year-old? I would have thought they would understand computing functions related to basic inequalities.

Exactly. Apple are the guy on “To Catch A Predator” where you say “I’m actually 14” and Apple replies “I know and I still want to show you sex apps”.

There is absolutely no excusing this behaviour and anyone still defending this is sub-human and/or an Apple shareholder.
 
  • Like
Reactions: VulchR
Does Apple need parental controls to avoid selling an 18+ rated app to a self-declared 14-year-old? I would have thought they would understand computing functions related to basic inequalities.
Use...parental...control...on...your...kid's...iPhone.
 
  • Disagree
Reactions: VulchR
Use...parental...control...on...your...kid's...iPhone.

Or… Apple… could… pay… attention… to… the… date… of… birth… they… ask… for…

Imagine a pedo in court: “yes your honour the child told me they were 14 and I raped them, but their parents didn’t specifically ask me not to so blame them!”
 