BTW, every time you buy food, do you explicitly state that you don’t want poison? Otherwise the business has the freedom to poison you, right? I’m sure you never neglect to state your demand for non-poisoned food.

By the same token, not every item that you could put in your mouth needs to be labeled "do not consume". Paper napkins do not need to be labeled as poisonous, nor should there be a notice that says "do not eat".
 
I agree with you; however, many parents aren’t acting like parents, sadly.
Typical parent response: “All the friends of my daughter (11yo) have WhatsApp, Instagram and Facebook; how am I supposed to forbid her from using something all her friends are using?”

Here is a short answer: Yes, if you so choose.

If all the other kids jumped off a bridge ...
 
Apple’s current implementation only checks against known CSAM images. By changing 2-3 lines of code in the future, they would have access to your entire phone.

Please answer this: Apple could start checking this known CSAM database against iCloud images TODAY if they really cared. Or at any point in the last 13 years. But no. Why not?
Of course only Apple knows the real reason, but I think they waited until the devices were capable of doing it on-device so that they could continue to claim they don’t scan iCloud user data. It is similar to moving Siri to on-device processing to avoid sending audio clips to their servers.
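To put the "only checks known images" claim in concrete terms: the matching step is essentially set membership against a database of fingerprints, roughly like the sketch below. This is only a simplified illustration with hypothetical Swift types; Apple's published design uses NeuralHash plus a private-set-intersection protocol, not a plain set lookup.

```swift
// Simplified sketch of "match only against a known database".
// All types and names here are hypothetical illustrations, not Apple's
// actual NeuralHash / private-set-intersection implementation.

import Foundation

struct ImageFingerprint: Hashable {
    let bytes: [UInt8]   // a fixed-length perceptual hash of the image content
}

struct KnownImageDatabase {
    private let knownFingerprints: Set<ImageFingerprint>

    init(knownFingerprints: Set<ImageFingerprint>) {
        self.knownFingerprints = knownFingerprints
    }

    /// A photo is flagged only if its fingerprint is already in the database;
    /// a novel photo can never match, no matter what it depicts.
    func matches(_ fingerprint: ImageFingerprint) -> Bool {
        knownFingerprints.contains(fingerprint)
    }
}
```

The objection earlier in the thread is that nothing about such a lookup constrains what goes into the known database, or what gets hashed in the first place.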
 
Are you deliberately misunderstanding the issue?

If a 14 year old says “I’m 18” then yes Apple is not at fault for showing them adult apps.

If a 14 year old says “I’m 14” then Apple is absolutely at fault for showing them adult apps.

How this is even an argument is mind-blowing. The lengths to which the cult of Apple will go to condone the sexual abuse of minors are insane.
Did you ever see an R-rated movie before you turned 17/18? A movie theatre may check the ID of those under 17 buying a ticket, but a parent walking into the movie with a 6-year-old is making the decision to expose that child to the material.

The same parent can enable parental control on their phone. Once enabled, all of the measures that people are complaining about are moot.

Apple enables a feature by default - HOW DARE YOU!
Apple disables a feature by default - HOW DARE YOU!

These features have NOTHING to do with CSAM.
 
Is there any particular reason why Apple does not implement controls on App Store purchases when an Apple ID holder explicitly declares his/her age as 14 and tries to purchase an app meant for ages 17+? What exactly is the point of asking for an age when creating an Apple ID, then? Why should parents or a guardian have to turn on some control to restrict these purchases? What steps does Apple take to ensure that children under 17 do not download apps that are meant for 17+? Why does it take a 30% cut when it cannot implement this basic thing that affects a child's safety, yet wants to install a scanning mechanism on billions of phones to catch (really) 0.0001% (made-up percentage) of pedophiles? Is it because it does not care who purchases the app as long as it gets its 30% cut?
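For what it's worth, the check being asked for here is simple to express: compare the age the user already declared on their Apple ID against the app's age rating at purchase time. A minimal sketch, with hypothetical names (this is not Apple's actual App Store code):

```swift
// Hypothetical sketch of an age gate at purchase time.
// AccountProfile, AppListing, and canPurchase are illustrative names only.

import Foundation

struct AccountProfile {
    let declaredBirthDate: Date   // the birth date given when the Apple ID was created
}

struct AppListing {
    let name: String
    let minimumAge: Int           // e.g. 4, 9, 12, or 17, from the app's age rating
}

/// Age implied by the birth date the account holder declared.
func declaredAge(of account: AccountProfile, on date: Date = Date()) -> Int {
    Calendar.current.dateComponents([.year],
                                    from: account.declaredBirthDate,
                                    to: date).year ?? 0
}

/// Blocks the purchase when the declared age is below the app's rating floor,
/// with no parental controls involved at all.
func canPurchase(_ app: AppListing, with account: AccountProfile) -> Bool {
    declaredAge(of: account) >= app.minimumAge
}
```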


If you had a 16-year-old, would you be OK with them listening to music with explicit lyrics?


The article above is Apple's support document for parental controls.

Apple has chosen to make the features non-default, with lots of options and granularity. The parent decides which features they want to enable, not Apple. Simple as that.

The world has gone nuts over Apple making recent changes that are intended to protect children. They may have made the decision to go the other way in the past.
 



For its investigation, the watchdog group did not activate parental controls on the hypothetical 14-year-old user's iPhone.

Article Link: Watchdog Investigation Finds 'Major Weaknesses' in Apple's App Store Child Safety Measures

Not activating parental controls is one huge caveat. In some cultures, a 14-year-old is considered adult-equivalent (e.g. the bar/bat mitzvah) and eligible to be tried under the law for his/her own actions. Thus, parents who do not consider their 14-year-old to be a "young adult" would need to activate parental controls for their non-adult children.

With this in mind, some other cultures do not consider full adulthood until the age of 30 (a will and testament lawyer once told me) and thus parental controls should be extended to this age.

On the other end of the spectrum, people of very advanced age tend to experience diminished mental abilities. In other words, these seniors are more prone to cyber crimes, or at least disinformation campaigns. Therefore Apple should also provide a facility similar to parental controls for adult children to supervise their parents' devices.
 
Bad watchdog group simulates bad parenting by not turning on parental controls! 🍸🍸😹😹

>> ”For its investigation, the watchdog group did not activate parental controls on the hypothetical 14-year-old user's iPhone.” <<

Might want to lead the story with this fact! 💡
 
You're right! Apple blocking apps that Apple (or the law) says are for people at least a certain age, using the age given to Apple, is WAY TOO HIGH a bar for Apple. I mean, how would you even program such a thing into the App Store?
You’re right, my bad. While we’re at it, let’s add facial recognition to television sets to make sure the kiddos aren’t watching mature content. For the kids. Let’s take as much responsibility away from the parents as possible.
 
Again, what’s the cost to the consumer?

By doing the scan on your phone, Apple is using your phone‘s battery and processor to do the work that Google and Microsoft etc. carry out on their servers. Every image that goes to iCloud is now a drain on your battery and a reduction in its life. I think that is the cost that folk are referring to.

Apple runs iCloud storage from Amazon and Google servers, which costs them millions per month.



The problem could be that running services to scan for CSAM on those servers would increase this cost, and Apple is not keen on paying. It would also require that Apple sends the file in an unencrypted format to the servers for scanning.

The other problem is that Apple totally failed to read the room. People don’t really care that their photos are unencrypted. But they do care about their personal devices spying on them and reporting them without warning them first. Folk care that Apple is assuming they’re guilty of something abhorrent and then showing their photos to perfect strangers.

Whodathunkit?

… none of which has anything to do with this article.
 
turn off iCloud and it turns off CSAM detection. do it before iOS 15 is released and it'll stay off.
if you're a new user, don't sign up for iCloud and therefore it won't be enabled ever. this isn't hard.

jesus christ, i'm done talking with you. bye.

I understand your frustration! Some people on this forum will simply refuse to be satisfied with whatever Apple does. If it's not this, they'll find something else to make an issue out of. If everything's "hunky-dory" then they get bored, I guess - nothing to rant and rave about.

One clarification, though. New users can use iCloud for many things other than photos. All they need to do is turn off Photos under iCloud settings and continue to use all the other features.
 
turn off iCloud and it turns off CSAM detection. do it before iOS 15 is released and it'll stay off.
if you're a new user, don't sign up for iCloud and therefore it won't be enabled ever. this isn't hard.

jesus christ, i'm done talking with you. bye.

This would only work as an argument if Apple had introduced the scanning at the same time they’d introduced iCloud photo storage. Telling folk to turn it off now means their customers have the burden of moving possibly thousands of pictures to a different service.

… and still nothing to do with the article.
 
This would only work as an argument if Apple had introduced the scanning at the same time they’d introduced iCloud photo storage.
No.

Telling folk to turn it off now means their customers have the burden of moving possibly thousands of pictures to a different service.

That would be unrelated to the issue of "enabled by default". Instead that would be related to the issue of "migration". That's a separate issue. Most customers who have signed up for 1) an iCloud account and 2) paid for iCloud storage would have used iCloud photos regardless of "enabled by default" setup.

You're trying so hard to make my argument apply to a different issue. Makes no sense.

… and still nothing to do with the article.

Guess who brought up "spiPhone"? Hint: not me. So why are you telling me this?

Next time, when you jump in the middle of a conversation, read the entire conversation.
 
No.



That would be unrelated to the issue of "enabled by default". Instead that would be related to the issue of "migration". That's a separate issue. Most customers who have signed up for 1) an iCloud account and 2) paid for iCloud storage would have used iCloud photos regardless of "enabled by default" setup.

You're trying so hard to make my argument apply to a different issue. Makes no sense.



Guess who brought up "spiPhone"? Hint: not me. So why are you telling me this?

Next time, when you jump in the middle of a conversation, read the entire conversation.

I read the whole thread, and was merely commenting that it really isn’t about the CSAM scanning.

You chose to reply to an off-topic reply:

turn off iCloud and it turns off CSAM detection. do it before iOS 15 is released and it'll stay off.
if you're a new user, don't sign up for iCloud and therefore it won't be enabled ever. this isn't hard.

And I chose to reply to your reply.

Next time, when you jump in the middle of a conversation, read the entire conversation.

Next time, when you jump on the internet, don‘t throw a hissy-fit when someone puts you right.

Or throw it anyway, if it helps.
 
was merely commenting that it really isn’t about the CSAM scanning.

That's a stretch. Re-read this sentence you wrote: "Telling folk to turn it off now means their customers have the burden of moving possibly thousands of pictures to a different service". That is literally continuing the discussion in relation to CSAM scanning, not "merely commenting" that it's off-topic.

And I chose to reply to your reply.

I believe that would fall under, in your own words: "nothing to do with the article." considering you continued the CSAM discussion.

Next time, when you jump on the internet, don‘t throw a hissy-fit when someone puts you right.

Or throw it anyway, if it helps.

Interesting how you decided to stop commenting about the enabled-by-default issue. Either

1) You realized you shouldn't have continued discussing the CSAM issue after accusing me of going off topic

or

2) You realized you are wrong about the "enabled by default" issue and are now trying to "win" this argument by making it more about being off-topic

In either case, there's nothing more of substance to discuss, as the mods are almost certainly going to clean these messages up, so any further discussion is pointless. I'm going to end it here with you. Have a good one.
 
What I find idiotic and ridiculous is that in many countries around the world the legal age for having sex is 14 to 16. In many countries the legal age for getting married is again 14 to 16 (some countries require parental consent), and yet these same people are not allowed to watch people having sex, because that is considered pornography and thus they must be of legal age (18 or above). So a 14-16 year old can legally have sex, but they are not allowed to watch people having sex until they are 18!

It therefore makes the situation very complex when you have to devise controls to prevent minors from accessing adult-related material, because each country has different rules and laws governing 'legal age'. That is why companies try to find a system that can work for all, and Apple did this by introducing 'parental controls'. The system is there, and it is not Apple's fault if adults/parents do not use it. It is very disingenuous of charities and children's groups to have a go at Apple, saying they are not doing enough to prevent minors from accessing adult-related material, when these groups should be focusing their attention on the real problem: the adults and parents who allow minors to get access to adult-related material.

If I were to give one of my under-16 relatives my personal details so they could open a bank account to do adult stuff with, that would be considered fraud: a minor pretending to be an adult. Yet when it comes to companies that provide age-restriction services, they are the ones who get blamed if a minor pretends to be an adult and creates an account using information garnered from an adult. It is still fraud, yet charities and children's groups always conveniently avoid mentioning this.

If a company uses an industry-accepted age-restriction system and a minor bypasses it using the credentials of an adult, then the minor and the adult whose credentials the minor used should be arrested. Why should companies that use well-established age-restriction systems be penalised for the actions of minors whose intention is to cheat the system?
 
Seems like the Campaign for Accountability is forgetting to hold parents accountable for ensuring their kids aren't downloading dating apps that contain pornographic material. In fact, in order to do that, a child needs to know the password for the Apple ID, so there are multiple opportunities for a parent to be responsible for preventing their child from accessing this material. Parents passing the buck to tech companies is lazy as hell. If you're campaigning for accountability, start with yourself.
 
It works as intended. Age and content restrictions only work if someone enables parental controls.

What they did was set up an iPhone without any parental controls enabled and check whether they could download apps rated 17+. They can.

Working as designed.
Except, why does Apple ask for your age when creating an Apple ID if it's not going to enforce it?
One would think that creating an Apple ID for a kid and their iPad wouldn't require you to set up parental controls.
 
Technology can always only do so much. At some point Parents need to... Parent. If you can't do that, maybe you shouldn't have kids.
Agreed, but it’s not without precedent that a merchant is expected to verify the age of the customer before proceeding with a transaction involving substances that cannot legally be sold to minors. I’m not sure why a tech company with Apple’s considerable resources can’t devise a way to avoid selling certain apps to an iCloud ID belonging to a minor. They’ve certainly performed an impressive recent flex in the name of protecting children, even if it makes all adult customers feel the privacy of their individual devices is being compromised. If it’s for the children, then it must be done against the “shrieking voices”, so get on with it, Apple. /s
 
You do understand that if Apple wanted to screw you, they always could have done so in the past or at any time in the future, right? Same with any tech company. If you're this paranoid, then you really need to move off the grid, preferably underground to avoid satellite and drone surveillance by all the people out to get you.
I am concerned with what Apple proposes to do now. They have marketed an image of safeguarding privacy. Now they are violating that. Moreover, they have just given a blueprint for 'surveillance with privacy' to every authoritarian government on the planet. Frankly, you seem to be a little naive and complacent. Suffice it to say that I remember the days of Hoover at the FBI, and various members of my family and friends worked for the DoJ, FBI, Pentagon, and CIA. If surveillance can be abused, it will be abused. Apple has taken the first step down this road.

And need I remind you that Apple is doing this because, for the first time in their history, they have the computing power on mobile devices to start doing surveillance based on local AI. That is new. They could not have done this before.
 
Sure! It's fun making stuff up. The sky is no longer the limit!
My point is that there was nothing preventing them from doing that. We as users trusted them with private information because their corporate ethos seemed to be focused on user privacy. Now they intend to engage in blanket surveillance, and the mere fact they have proposed to do this makes me reevaluate their motives.
 
I am concerned with what Apple proposes to do now. They have marketed an image of safeguarding privacy. Now they are violating that. Moreover, they have just given a blueprint for 'surveillance with privacy' to every authoritarian government on the planet. Frankly, you seem to be a little naive and complacent. Suffice it to say that I remember the days of Hoover at the FBI, and various members of my family and friends worked for the DoJ, FBI, Pentagon, and CIA. If surveillance can be abused, it will be abused. Apple has taken the first step down this road.

And need I remind you that Apple is doing this because, for the first time in their history, they have the computing power on mobile devices to start doing surveillance based on local AI. That is new. They could not have done this before.

We both have the same set of facts available to us concerning what Apple is doing here. I'm taking them at face value. You're reading into them and committing the slippery slope fallacy. Since your argument is based on logical fallacy and you're persistent in it, there's no point in further discussion, as we have left the realm of rationality.

👋
 
My point is that there was nothing preventing them from doing that. We as users trusted them with private information because their corporate ethos seemed to be focused on user privacy. Now they intend to engage in blanket surveillance, and the mere fact they have proposed to do this makes me reevaluate their motives.

Vote with your wallet? Will you do it?
 
By doing the scan on your phone, Apple is using your phone‘s battery and processor to do the work that Google and Microsoft etc. carry out on their servers. Every image that goes to iCloud is now a drain on your battery and a reduction in its life. I think that is the cost that folk are referring to.

Apple runs iCloud storage from Amazon and Google servers, which costs them millions per month.



The problem could be that running services to scan for CSAM on those servers would increase this cost, and Apple is not keen on paying. It would also require that Apple sends the file in an unencrypted format to the servers for scanning.

The other problem is that Apple totally failed to read the room. People don’t really care that their photos are unencrypted. But they do care about their personal devices spying on them and reporting them without warning them first. Folk care that Apple is assuming they’re guilty of something abhorrent and then showing their photos to perfect strangers.

Whodathunkit?

… none of which has anything to do with this article.
You’re reaching really hard to try to come up with a negative and you’re failing.
 