Parents should be the first line of defense in safeguarding their offspring... Abdicate that and... well... let the Wild West do it for ya.

So if a parent turns their back on their child for a second that’s them implicitly giving permission to abduct the child and rape them?

BTW every time you buy food do you explicitly state you don’t want poison? Otherwise the business has the freedom to poison you right? I’m sure you never abdicate your demand for non-poison food.
 
Or… Apple… could… pay… attention… to… the… date… of… birth… they… ask… for…

Imagine a pedo in court: “yes your honour the child told me they were 14 and I raped them, but their parents didn’t specifically ask me not to so blame them!”
It's not as if a kid couldn't just enter any age. And then what? Should Apple require you to upload some ID to verify your age? I bet you'd be the first one yelling: "PRIVACY!"
 
And what is this proposed cost increase to the consumer?

Apparently enough that Apple has refused to do these checks for the last 13 years and will only start for iOS 15 users who pay the cost themselves.
 
The crux of this claim is that a user creating an account and choosing an age between 13 and 16 isn't prevented from downloading 17+ rated apps. Age restrictions on apps have only ever been utilized by Screen Time (née Restrictions). Prior to iOS 12, Screen Time (and Restrictions) had to be managed on the device itself or via an MDM solution. A responsible parent would enable Screen Time/Restrictions prior to handing a new device to their child, which eliminates this issue entirely.

That said, Apple absolutely has the ability to enforce these age restrictions based on the age of the account. Where this entire "experiment" breaks down is that someone creating an Apple ID can put in whatever birthday they want, claiming to be 18 and bypassing any account-based restriction entirely. For this reason, account-based restriction is the wrong solution. Instead, Apple should do a better job of promoting Screen Time to parents, especially when iTunes Family is enabled and child and teen accounts are joined to said family.

As always, the responsibility of protecting one's children falls on parents, not everyone else. Apple may as well enable account-based restriction anyway, as they can then wash their hands of responsibility (until some other group begins complaining that Apple doesn't verify age when users create accounts).
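For illustration only, here's a rough Swift sketch of what an account-based gate could look like. The types and logic are invented for the example, not Apple's actual App Store code:

```swift
import Foundation

// Purely illustrative: these types and this check are invented for the
// example and do not reflect Apple's actual App Store implementation.
struct AppListing {
    let name: String
    let minimumAge: Int   // App Store ratings: 4+, 9+, 12+, 17+
}

struct AppleAccount {
    let birthDate: Date   // whatever the user typed in at sign-up
}

func canDownload(_ app: AppListing, with account: AppleAccount, on date: Date = Date()) -> Bool {
    // Compute the account's claimed age from its self-reported birth date.
    let age = Calendar.current.dateComponents([.year],
                                              from: account.birthDate,
                                              to: date).year ?? 0
    return age >= app.minimumAge
}
```

The obvious hole is the input: an account whose self-reported birth date says 18+ passes the check every time, which is exactly why Screen Time on the device is the more reliable layer.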
 
It's not as if a kid couldn't just enter any age. And then what? Should Apple require you to upload some ID to verify your age? I bet you'd be the first one yelling: "PRIVACY!"

Are you deliberately misunderstanding the issue?

If a 14 year old says “I’m 18” then yes Apple is not at fault for showing them adult apps.

If a 14 year old says “I’m 14” then Apple is absolutely at fault for showing them adult apps.

How this is even an argument is mind-blowing. The lengths to which the cult of Apple will go to condone the sexual abuse of minors are insane.
 
Are you deliberately misunderstanding the issue?

If a 14 year old says “I’m 18” then yes Apple is not at fault for showing them adult apps.

If a 14 year old says “I’m 14” then Apple is absolutely at fault for showing them adult apps.

How this is even an argument is mind-blowing. The lengths to which the cult of Apple will go to condone the sexual abuse of minors are insane.
Apple is showing them adult apps? Why isn't Apple showing me adult apps? I have to specifically search for them. Also, if a kid is honest enough to enter their real age, why would they download apps clearly not suited for them? 😂
 
Apple’s current implementation only checks known CSAM images. By changing 2-3 lines of code in the future they'd have access to your entire phone.

sigh...they don't even have access to your images NOW with iOS 15. ALL they would see would be information about detected CSAM once it's uploaded to iCloud. You clearly don't even understand the most basic aspect of how this works.

Please answer this: Apple could start checking this known CSAM database against iCloud images TODAY if they really cared. Or at any point in the last 13 years. But no. Why not?

Because they wanted to do it in a way that maximizes the user's privacy. That's the whole POINT of them implementing this detection method in iOS 15.
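For anyone who wants to picture what "checking known CSAM images" even means, here's a deliberately oversimplified Swift sketch. Apple's real pipeline uses a perceptual hash (NeuralHash), a blinded on-device database, private set intersection, and a match threshold, none of which is shown here; this only illustrates the basic idea of comparing uploads against a fixed list of known hashes instead of inspecting image content:

```swift
import Foundation
import CryptoKit

// Deliberately oversimplified: Apple's real pipeline uses NeuralHash,
// a blinded on-device database, private set intersection, and a threshold
// before anything is even decryptable server-side. This only shows the
// basic idea of "compare against known hashes, don't inspect content".
func loadKnownHashes() -> Set<String> {
    // Placeholder for a database of hashes of known images.
    return []
}

func matchesKnownImage(_ imageData: Data, knownHashes: Set<String>) -> Bool {
    // Hash the upload and check membership in the known-hash set.
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}
```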
 
You do understand that with trivial modifications the hashes could be of anything, applied to any file, right?

You do understand that if Apple wanted to screw you, they always could have done so in the past or at any time in the future, right? Same with any tech company. If you're this paranoid, then you really need to move off the grid, preferably underground to avoid satellite and drone surveillance by all the people out to get you.
 
Satire. *cough*

I don't think you know what satire is, because your post I was replying to clearly wasn't satire. You were seriously entertaining the thought that Apple could have been stealing people's personal files on their iPhones.
 
sigh...they don't even have access to your images NOW with iOS 15. ALL they would see would be information about detected CSAM once it's uploaded to iCloud. You clearly don't even understand the most basic aspect of how this works.



Because they wanted to do it in a way that maximizes the user's privacy. That's the whole POINT of them implementing this detection method in iOS 15.

Well done for parroting the Apple propaganda. But how is it more private to scan files locally on iOS 15 before they're uploaded to iCloud than it would have been to scan the files on iCloud at any point in the last 13 years? Especially since in places like China Apple openly gifts the government access to files?

iCloud is not encrypted, people. Never has been, and so long as China has power, never will be. Wake up.
 
MacRumors had an article about how that doesn't really work as intended; here, let me give you the link:
Oh, wait, it's the article you're commenting on, whoops.

It works as intended. Age and content restrictions only work if someone enables parental controls.

What they did was set up an iPhone without any parental controls enabled and check whether they could download apps rated 17+. They could.

Working as designed.
 
Are you deliberately misunderstanding the issue?

If a 14 year old says “I’m 18” then yes Apple is not at fault for showing them adult apps.

If a 14 year old says “I’m 14” then Apple is absolutely at fault for showing them adult apps.

How this is even an argument is mind-blowing. The lengths to which the cult of Apple will go to condone the sexual abuse of minors are insane.

No, it's fine that age restrictions have to be turned on. They should be opt-in and not enforced by Apple.
 
No, it's fine that age restrictions have to be turned on. They should be opt-in and not enforced by Apple.

If Apple doesn’t enforce it then they shouldn’t ask in the first place. Otherwise it’s a pointless question that only serves to put honest users at risk of identity fraud by exposing their personal information.
 
Are you deliberately misunderstanding the issue?

If a 14 year old says “I’m 18” then yes Apple is not at fault for showing them adult apps.

If a 14 year old says “I’m 14” then Apple is absolutely at fault for showing them adult apps.

How this is even an argument is mind-blowing. The lengths to which the cult of Apple will go to condone the sexual abuse of minors are insane.

I’m so bloody sick of the children 🙄
 
The investigation concluded that Apple has created an ecosystem that is much more dangerous for minors than the company advertises. - Michelle Kuppersmith

Oh dear, oh my. Apple's ecosystem is just so dangerous for “minors” a.k.a. “children”. 🙄

I’ve got news for you, woman. Planet Earth is dangerous for minors. Adults too!
 
Apple has an age rating system and parents can block their kids from downloading apps based on the age rating. I’m not sure what else they can do. Should they just block apps based on the age on their Apple ID? I think parents should be acting like the parents here, not Apple.
I agree with you, however, many parents aren’t acting like parents, sadly.
Typical parent response: “All the friends of my daughter (11yo) have WhatsApp, Instagram and Facebook, how am I supposed to forbid her something all her friends are using?”
 
For its investigation, the watchdog group did not activate parental controls on the hypothetical 14-year-old user's iPhone.

That's all that needed to be said. It should've been the lead-in to the article. Sounds like it's acting exactly like it should. A better headline would be: “Watchdog Group Finds Apple Respects Parental Control Settings, or Lack Thereof.”

It's up to parents not Apple to decide what apps minors use. Apple did enough by setting age guidelines for apps. Then it's up to parents to decide if they want to use parental controls or let their kids run wild.

Parents have different styles. Mostly it's about teaching and setting a good example. If you're a parent to be proud of, the kid will want to make you proud of them. They'll just do stupid things sometimes and learn from their mistakes.

Anyways, I think parents delude themselves. They're kids. Even if you don't allow it, they'll have seen violence and nudity on TV or on a computer. If they want to see it, they'll just learn to hide it, and learn that you aren't someone to talk to about it.

Even before the internet, there was always at least one friend with an older brother who had a stack of Playboys, and always a friend with HBO or Skinamax.
 
Well done for parroting the Apple propaganda.

What you call "parroting the Apple propaganda" I call stating the facts of how Apple has described the technical process of the detection (again, if you have evidence proving otherwise, then present it - you clearly don't). What you call "wake up" is actually an invitation for people to think with their emotions rather than reason.

But how is it more private to scan files locally on iOS 15 before they're uploaded to iCloud than it would have been to scan the files on iCloud at any point in the last 13 years? Especially since in places like China Apple openly gifts the government access to files?

Because as has already been explained to you, Apple can't see what's on your phone. It's only data about illegal images that come OFF your phone ONTO iCloud that they could potentially see. If the scan happens on iCloud, then they see it all. Yes, Apple COULD access any of your files on iCloud, but now they can scan without having to access all your files (including non-image files) on their servers. If you still have a problem with it, then don't use iCloud for photos. Everyone has a choice.

iCloud is not encrypted, people. Never has been, and so long as China has power, never will be. Wake up.

iCloud IS encrypted - it's just that Apple has the encryption keys.
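To spell out that distinction, here's a toy CryptoKit sketch; it has nothing to do with iCloud's real storage design and only shows the difference between "encrypted, but the provider holds the key" and end-to-end encryption:

```swift
import Foundation
import CryptoKit

// Toy illustration only; this is not how iCloud is actually implemented.
let photo = Data("user photo bytes".utf8)

// "Encrypted, but the provider holds the key": the ciphertext sits on the
// server, and the provider can decrypt it whenever it wants or is compelled to.
let providerKey = SymmetricKey(size: .bits256)
let stored = try! AES.GCM.seal(photo, using: providerKey)
let providerCanRead = try! AES.GCM.open(stored, using: providerKey)
assert(providerCanRead == photo)

// End-to-end: the key never leaves the user's devices, so the provider
// stores ciphertext it cannot open; AES.GCM.open with any other key throws.
let userOnlyKey = SymmetricKey(size: .bits256)
let e2eStored = try! AES.GCM.seal(photo, using: userOnlyKey)
_ = e2eStored
```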
 