What is the point of this? How exactly would Apple, or any other platform, know if a user was over a certain age? Should they implement the fool-proof birth date prompt? How else would they know for sure? The best way is for someone to make those decisions for the children — their parents.
 
Google started CSAM checks in 2013.

Why has Apple decided children only deserve to be protected from next month?
Because there was considerable policy debate on how to do this without scanning your emails, cloud files, etc.? It's great that Google is scanning digital assets left, right, and center, as well as up and down. At least Apple seems to be striking a balance between privacy and protecting children.
 
iCloud IS encrypted - it's just that Apple has the encryption keys.

Apple grants almost all requests for data. And that’s just what they admit to in their transparency reports. So Apple having the keys is as good as there being no keys.
 
It's always up to the parent to be a parent.

Hopefully, Apple can close loopholes that aren't societal and fix the process. They cannot change psychology.
 
Enabled by default...

So what? Tons of apps have "send diagnostic data" or other options the developer thinks are best on by default when you install them. Just turn the option off. We're all adults here and capable of setting up our iPhones the way we want. No one's holding a gun to our heads to keep iCloud Photos on.
 
Apple grants almost all requests for data. And that’s just what they admit to in their transparency reports. So Apple having the keys is as good as there being no keys.

No, the point of the encryption is to keep hackers away from your files. Apple has always been up front about the fact that it can access your files for purposes of investigation or at the request of other agencies (see the iCloud legal agreement, section V.E.). If people were under the impression that Apple couldn't access or hand over their account data or content, then they didn't do their due diligence to educate themselves, if that was a concern for them. So if you have extremely sensitive data in iCloud, I'd suggest you store and back it up locally instead.
 
The watchdog group did not activate parental controls?? Rookie move. The headline should read:

Major Weakness in Watchdog's Investigation of Apple's App Store Child Safety Measures.
 
If a 14-year-old says “I’m 18,” then yes, Apple is not at fault for showing them adult apps.

If a 14-year-old says “I’m 14,” then Apple is absolutely at fault for showing them adult apps.

How this is even an argument is mind-blowing. The extent to which the cult of Apple will condone the sexual abuse of minors is insane.

IMHO, ratings should be advisory, and the ultimate decision whether content can be accessible to a minor should belong exclusively to that minor's guardian. If a 14-year-old says "I'm 14" and the guardian of said minor says "I'm fine with him/her accessing apps rated higher," Apple should not prevent that.

Guardians of minors have not only the right, but the duty to make these decisions and Apple gives them the tools to enforce whatever policy they want to apply to the minors they are responsible for.
 
Could not agree more. Screen Time and Child Protection are a joke and buggy as hell. I have a 13- and a 14-year-old, and it's nearly impossible to control anything with them. Half the time it does not work and does not sync between devices.
 
So what? Tons of apps have "send diagnostic data" or other options the developer thinks are best on by default when you install them. Just turn the option off. We're all adults here and capable of setting up our iPhones the way we want. No one's holding a gun to our heads to keep iCloud Photos on.
Installing an app versus an OS level default setting are not the same. 😓
 
What part of spying do you not understand?
That is enabled by default.
turn off iCloud and it turns off CSAM detection. do it before iOS 15 is released and it'll stay off.
if you're a new user, don't sign up for iCloud and therefore it won't be enabled ever. this isn't hard.

jesus christ, i'm done talking with you. bye.
 
Is there any particular reason why Apple does not implement controls on App Store purchases when an Apple ID holder explicitly declares his/her age as 14 and tries to purchase an app meant for ages 17+? What exactly is the point of asking for age when creating an Apple ID, then? Why should parents or a guardian have to turn on some control to restrict these purchases? What steps does Apple take to ensure that children under 17 do not download apps meant for 17+? Why does it take a 30% cut when it cannot implement this basic thing that affects a child's safety, yet wants to install a scanning mechanism on billions of phones to catch some vanishingly small (made-up) 0.0001% of pedophiles? Is it because it does not care who purchases the app as long as it gets its 30% cut?
 
Installing an app versus an OS level default setting are not the same. 😓

Who cares if it's something within an OS or an app? The principle is the same. And it's not even an OS-level default setting either, really. You still have to sign into iCloud before anything is turned on. It should be common sense to set that up how you want it for each device, and anyone who's concerned about this obviously will.
 
Did you know that back then, the App Store rating system was different? You could have an app rated 12+ for "frequent/intense" realistic violence. They revamped that in 2014, which caused apps that had been rated 9+ to become 12+, and so on.

They have revamped the ratings system three times. The second time was in 2019, when they moved "frequent/intense" simulated gambling from 12+ to 17+ (this is the one category where "frequent/intense" is common, though neither change made much difference). Then they revamped it again this year, focused on gambling/contests.

They used to show a warning when you tried to download a 17+ app, but they removed it (it doesn't exist anymore), even though, unsurprisingly, most 17+ apps were just web browsers with "unrestricted web access." I know how the App Store works, so games being rated 9+ or 12+ isn't that surprising to me.
 
Parental controls are like the locks on your house: functional, but not flawless. They encourage honesty, but the determined few will quickly defeat the mechanism.

The parental controls are good for preventing accidental downloads or purchases, but a healthy relationship and consistent discipline are what drive desired behavior. As your kids mature, an excuse such as “because I said so” should transition into an age-appropriate explanation: ”the things you consume shape your mind and thoughts - I’m trying to help you set healthy patterns and understand that not every idea, image, or video is necessarily wholesome for your well-being.” Besides a frustrated expression or eye-rolling, hopefully they are asking you “Why?”, which is your big opportunity to impart some wisdom :)
 
Apple has an age rating system, and parents can block their kids from downloading apps based on the age rating. I’m not sure what else they can do. Should they just block apps based on the age on their Apple ID? I think parents should be acting like the parents here, not Apple.
You're right! Apple blocking apps that Apple (or the law) says are for people at least a certain age using the age given to Apple is WAY TOO HIGH a bar for Apple. I mean how would you even program such a thing into the app store?
 
So this investigative group sets up an Apple ID, puts a credit card on it, doesn't enable parental controls, and makes up an imaginary 14-year-old to create this shockingly stupid story. It's funny, but it's also really sad and desperate. Probably the same morons behind CSAM scanning.
 
No they couldn’t. The CPU process and data access would be obvious. Now they have the excuse that any and all access to local files, and the broadcast of the results to an external server, is for CSAM.

Right... Apple, one of the most successful companies in the world, would have no idea, not even a clue how to implement that. Seriously? It would be pretty simple, just takes a modicum of imagination to do that undetected.

But the real question should be, what would Apple do with all that personal information/data/photos harvested from its 1+ billion active customers owning phones?
 
Use the damn parental controls on your kids iPhone. WTF is so hard about that?

Agreed! As far as I can tell, if parental controls are activated, all of this is bunk. This group simply expected that setting the age would enable all the parental controls. I'm not under 18 and don't have any kids, so I cannot validate that myself. There are lots of good reasons for someone under 18 to be accessing content that is rated 17+ but isn't "adult content."
 
No they couldn’t. The CPU process and data access would be obvious. Now they have the excuse that any and all access to local files, and the broadcast of the results to an external server, is for CSAM.
Photos have been getting scanned on Apple devices for years to identify people. How can you tell that an additional hash is getting created? Not to mention that Apple built the CPU and NPU and could hide usage from any monitoring tools.
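For anyone unclear on what "creating a hash" of a photo means here: perceptual hashes reduce an image to a small fingerprint that survives minor edits, so near-duplicates land close together. Below is a toy difference-hash (dHash) sketch in pure Python — deliberately simplified, and not how Apple's NeuralHash actually works (that uses a neural network over learned features):

```python
# Toy perceptual hash (dHash): one bit per horizontally adjacent
# pixel pair, set when the left pixel is brighter than the right.
# Similar images produce similar bit patterns.

def dhash(pixels):
    """pixels: a 2D list of grayscale values (rows of brightness numbers).
    Returns an integer whose bits encode left-vs-right brightness."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Count differing bits; a small distance means 'probably the same image'."""
    return bin(a ^ b).count("1")
```

The thread-relevant point: a fingerprint like this costs a few comparisons per pixel, far less than the face and scene analysis Photos already runs — which is the poster's argument that one more hash would be hard to spot from resource usage alone.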
 