The moment it comes to iPhone, I'm gone because I'm not a sheep.

For all the crap everyone on here gives Google, they would get slaughtered far worse if they pulled a stunt like this.

Apple is pretending to give a damn when this is nothing more than an attempt to sell devices to parents, so they think their kids will be safe with an iPhone, all while Saudi Arabia, Russia, China, etc. use it for hunting people down.

Funny thing is, this will force kids to use another messaging app to avoid being spied on, and the reign of iMessage in the US will be over.

Everyone knows deep down this is a terrible idea, but they'd rather turn a blind eye so they can still use iMessage with some lazy moron who can't download another messaging app.

That is the other aspect; how many kids use iMessage? Using my Foster Resource classes and local school district (in Cali) as a source, Snapchat is the biggest option as it is not OS specific. Group chats seem to be the big thing these days.
 
Personally, I feel that this is a fairly non-controversial topic, but I also think that's because it's a fairly straightforward feature with a clear intended audience and a clear-cut benefit. This feature is being pitched at parents who would appreciate more support in managing their children. Children below the age of 12 likely don't have the finances to purchase their own smartphones to begin with and are typically using hand-me-downs from their parents. It's a pretty compelling argument to stay within the Apple ecosystem and for the entire family to keep using iPhones.

When I prepare a presentation, I typically start by putting myself in the shoes of the audience and asking myself "why am I here and what's in it for me?". Alerting parents to when children under the age of 12 open potentially explicit photos is something that, in my opinion, needs to become common practice across the industry. It’s a “why did it take companies this long to figure this out?” type of thing.

That said, I do think that Apple's PR has done a horrendous job this year, especially by bundling this iMessage feature together with their far more controversial CSAM-detection feature, and I do understand why nobody really has any reason to support the latter. Even if I have no child porn on my Apple devices, and even if I have nothing to lose from having my photos scanned, it still brings me back to my initial question: what's in it for me? At best, there is no downside, which is no way to sell a feature.

My take on the CSAM-detection measures proposed by Apple is that they would actually be far less invasive than what other companies like Facebook are currently doing (basically scanning every single photo you upload to their service, not just for pornography, but data-mining it for everything it's worth). Nor is Apple combing through your entire photo library. Instead, Apple is comparing hashes of known child pornography against hashes of the photos you upload to iCloud. Apple is not looking through your vacation photos for images of your 5-year-old daughter in a swimsuit. Apple still won't know what your photos contain, either before or after they are uploaded to iCloud. In short, Apple appears to have come up with a way to find CSAM images uploaded to one's iCloud photos without actually dissecting and probing photo libraries in a way that voids privacy, assuming it works as advertised.

Human intervention is positioned as a type of fail-safe designed to catch the very few false positives (false allegations of someone possessing CSAM), rather than being the first line of defence.
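The hash-matching and threshold scheme described above can be sketched roughly as follows. This is an illustrative toy, not Apple's actual implementation: the real system uses a perceptual hash (NeuralHash) plus cryptographic techniques so that nothing is learned about accounts below the threshold, whereas this sketch substitutes SHA-256 and a plain set lookup, and the threshold of 30 (the figure Apple cited publicly) is hard-coded here purely for illustration.

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Stand-in for a perceptual hash like NeuralHash; SHA-256 is used
    # here purely for illustration (it only matches exact bytes, while
    # a perceptual hash tolerates resizing, cropping, etc.).
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes of known CSAM images (placeholder bytes).
known_hashes = {
    image_hash(b"known-bad-image-1"),
    image_hash(b"known-bad-image-2"),
}

MATCH_THRESHOLD = 30  # roughly the threshold Apple described publicly

def count_matches(uploads: list[bytes]) -> int:
    # Count how many uploaded photos match the known-hash list;
    # non-matching photos reveal nothing about their contents.
    return sum(1 for photo in uploads if image_hash(photo) in known_hashes)

def flag_for_review(uploads: list[bytes]) -> bool:
    # An account is surfaced for human review only once the number of
    # matches crosses the threshold, so isolated false positives are
    # caught by the review step rather than triggering a report.
    return count_matches(uploads) >= MATCH_THRESHOLD
```

The point of the threshold plus human review is exactly what the post says: the reviewer is a fail-safe for rare false positives, not the first line of defence.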

As for concerns about governments forcing Apple to use this feature to spy on their citizens, my response is that given the numerous hoops they would have to jump through, there are likely already more practical and easier ways of going about doing so than trying to retrofit Apple’s CSAM detection system or getting Apple to create a different version of CSAM detection. So while not impossible, I am willing to go out on a limb that a lot of these "what if" scenarios are simply slippery slopes. The story of a vengeful ex stealing your iPhone and storing child porn on it just so it gets flagged sounds more like a plot out of a Hollywood movie.

I don't see it as an invasion of my privacy, and I don't think there's much opportunity for abusing it, but it still doesn't answer the initial question: what's in it for me as the end user? What's the benefit to me if I am cleared of possessing child porn on my iPhone, when I was never under any suspicion of having any to begin with? Any pedophile aware of said feature is simply not going to use iCloud to store such material, and it feels more like a way for Apple to get law enforcement off their backs than it is about protecting users (and protecting us from what, exactly?).

I guess my point at the end of it all is that this, to me, is a discussion that deserves to be a lot more nuanced than being for or against privacy, especially when Apple's proposed implementation looks to be a lot less invasive than what other companies are already doing.

Get the parents to get the kids onto iPhones, and inertia will keep them brand loyal.
Marketing 101.
 
That is the other aspect; how many kids use iMessage? Using my Foster Resource classes and local school district (in Cali) as a source, Snapchat is the biggest option as it is not OS specific. Group chats seem to be the big thing these days.

It’s more that Apple can do this with iMessage because they control the app. They may make the API available to other chat services to adopt, but they won’t be able to force other services like Snapchat or WhatsApp to adopt it if the devs don’t want to.
 
Without completely giving away my age, I’ll just say that the iPhone was known when I got my first laptop.

The issue with authoritarianism is that by the time it's to the point where one can prove it's happening, it's too late. That's why you're seeing backlash to Apple's child safety program. We can see where this will lead and are trying to stop it before it starts. But again, this has more to do with CSAM detection, not iMessage protection.

Of course responsible parents wouldn’t give the go ahead to use guns unattended. The point I was trying to make is that I was taught about guns and given the opportunity to safely shoot them with family members, and thus the desire to mess with them when parents weren’t around was gone. There wasn’t a shroud of mystery around them.

Children are naturally curious. They want to know. You either properly teach and guide them or deal with the consequences of their insatiable curiosity.
 
As a parent / grandparent, I should have the option to install features like this, not have them forced upon us. Apple's take on right / wrong is not mine or any other parent's. We all differ; there may be some alignment. But this should be an installable (app) feature, not an on/off OS feature - something you as a parent can decide whether to install and implement.

JMO YOMV.

Apple has never done custom iOS installations. I simply don't get the complaint. If you don't want to use this feature (or any other), then simply don't use it. Not sure how it affects your life more than any other feature in iOS that you don't use 🤷‍♂️
 

It is, IMO, unneeded padding - a feature not required or wanted, with an element of (low) risk. Not exactly the same, but this is like adding Notes or Pages into the OS. Implementing items like this and couching them under the guise of “Help the …” is more of a marketing or recruiting ploy than an actual benefit to current users. Why not just sell a “kids” iPhone?

If / when turned on, I would be interested in the actual number of users who implement this ”feature”.

btw - from where did you get “custom” installations?
 

I'm sorry, but I just don't understand your thinking here. There are plenty of iOS features that I (personally) don't need nor want, yet I don't give them a passing thought, even though they're there in the code. Who cares? And of course you and I are not the center of the universe, so we need to realize that even though we may not need/want certain features, thousands of other people do.

I got "custom installations" from your words: "I should have the option to install features like this". AFAIK Apple has never had custom iOS installations that let you check/uncheck which features you want installed.
 
Apple has never done custom iOS installations. I simply don't get the complaint. If you don't want to use this feature (or any other), then simply don't use it. Not sure how it affects your life more than any other feature in iOS that you don't use 🤷‍♂️
Kinda like side loading? Don’t do it if you don’t want to
 
Maybe Apple can do a conference teaching parents how to parent better instead of being like "F it, we'll do it ourselves"
Maybe Apple should stay the hell out of “parenting”.

Well, that’s what they do, I guess. Some here are too simple to breathe without being told. iAir.
 
Personally, I feel that this is a fairly non-controversial topic, but I also think that's because it's a fairly straightforward feature with a clear intended audience and a clear-cut benefit. This feature is being pitched at parents who would appreciate more support in managing their children. Children below the age of 12 likely don't have the finances to purchase their own smartphones to begin with and are typically using hand-me-downs from their parents. It's a pretty compelling argument to stay within the Apple ecosystem and for the entire family to keep using iPhones.
Tldr
 
I'm sorry, but I just don't understand your thinking here. There are plenty of iOS features that I (personally) don't need nor want, yet I don't give them a passing thought, even though they're there in the code. Who cares? And of course you and I are not the center of the universe, so we need to realize that even though we may not need/want certain features, thousands of other people do.

I got "custom installations" from your words: "I should have the option to install features like this". AFAIK Apple has never had custom iOS installations that let you check/uncheck which features you want installed.

Maybe this will help in understanding: Does this have to be built into the OS directly? Especially a feature that I suspect will get little use. IMO


Still don’t get where you came up with “custom”. This would be like purchasing an app that “adds functionality” to iMessage - layering it on top.
 
Maybe this will help in understanding: Does this have to be built into the OS directly? Especially a feature that I suspect will get little use. IMO

I just completely disagree that it will get little use. I imagine TONS of parents will enable this feature.

Still don’t get where you came up with “custom”. This would be like purchasing an app that “adds functionality” to iMessage - layering it on top.

Aren't those all third-party apps? Apple doesn't make apps like that (AFAIK), so I'm not sure why they would/should start doing so now for this feature when they don't for any other. Would be very inconsistent and make little sense.
 

That is why I would be interested in the number of folks who actually use it.

How many actually use the Child accounts today?
 

While I'm sure there are some who don't, you'd have to be pretty naive as a parent to let your child have an iPhone (or other smartphone) without enabling at least some parental control features (more or less, depending on the particular child's age, etc.). So I imagine a large number do, thus the reason why Apple implements it as part of iOS. I still don't understand why you're bothered by features you personally disable/don't use being in the code of iOS. Just use your phone and don't give them a second thought.
 

Yes and no. Most folks have no clue how to even set that up. Heck, most don't want to take the time. Most major carriers here in the US want to push their own solutions using their apps and services.

I went looking for numbers or percentages for child iPhone devices vs. parent-managed ones and have yet to find anything.
 