It's sad that we live in a world that needs this kind of thing, but we do. Good for Apple for implementing this. I see absolutely ZERO downside to such parental controls.

EDIT: Ok, for the "disagrees" that are now coming in: If your, say, 10-year-old son or daughter were being sent pornographic images, are you telling me you wouldn't want to know about that? Please, in the name of all that is decent, explain to me how this is a bad thing?
If someone accesses your phone and puts those images onto your account, tell me how you're going to explain that they are not "yours"?
 
In an earlier poll, 70% of forum-goers voted that they were not against the proposed CSAM measures, and the people here are already considered the more vocal critics when it comes to anything Apple.

Just something to think about. The public may not be as opposed to such a move as we may otherwise have been led to believe by the online furore.
People no longer value their privacy as they once did. Then again, people didn't use to worship corporations the way some now worship Apple Computer.
 

Attachments

  • The Church of Steve Jobs.png
People no longer value their privacy as they once did. Then again, people didn't use to worship corporations the way some now worship Apple Computer.
Personally, I feel that this is a fairly non-controversial topic, but I also think that's because it's a straightforward feature with a clear intended audience and a clear-cut benefit. This feature is being pitched at parents who would appreciate more support in managing their children, and children below the age of 12 likely don't have the finances to purchase their own smartphones to begin with; they are typically using hand-me-downs from their parents. It's a pretty compelling argument to stay within the Apple ecosystem and for the entire family to keep using iPhones.

When I prepare a presentation, I typically start by putting myself in the shoes of the audience and asking myself "why am I here and what's in it for me?". Alerting parents when children under the age of 12 open potentially explicit photos is something that, in my opinion, needs to become common practice across the industry. It’s a “why did it take companies this long to figure this out?” type of thing.

That said, I do think that Apple's PR has done a horrendous job this year, especially by bundling this iMessage feature together with their far more controversial CSAM-detection feature, and I do understand why nobody really has any reason to support the latter. Even if I have no child porn on my Apple devices, and even if I have nothing to lose from having my photos scanned, it still brings me back to my initial question - what's in it for me? At best, there is no downside, which is no way to sell a feature.

My take on the CSAM-detection measures proposed by Apple is that they would actually be far less invasive than what other companies like Facebook are currently doing (basically scanning every single photo you upload to their service, not just for pornography but data-mining it for everything it is worth). Nor is Apple combing through your entire photo library. Instead, what Apple is doing is comparing hashes of known child pornography against hashes of the photos you upload to iCloud. Apple is not looking through your vacation photos for images of your 5-year-old daughter in a swimsuit. Apple still won't know what your photos contain, either before or after they are uploaded to iCloud. In short, Apple appears to have come up with a way to find CSAM images uploaded to one’s iCloud photos without actually dissecting and probing photo libraries in a way that voids privacy, assuming it works as advertised.
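
To make that concrete, here's a rough Swift sketch of the kind of check being described. It's purely illustrative: the names are made up, and an ordinary SHA-256 digest stands in for Apple's perceptual NeuralHash (which would also match resized or re-encoded copies).

```swift
import Foundation
import CryptoKit

// Stand-in for the on-device database of hashes of known CSAM supplied by
// child-safety organizations. (In the real system the entries are blinded so
// the device can't read them; here it's just a plain Set for illustration.)
let knownImageHashes: Set<String> = []   // would ship with the OS

// Toy stand-in for a perceptual hash: an exact SHA-256 digest of the bytes.
func imageHash(_ photoData: Data) -> String {
    SHA256.hash(data: photoData).map { String(format: "%02x", $0) }.joined()
}

// The check only runs on photos queued for upload to iCloud Photos; the rest
// of the library is never touched, and no photo content leaves the device.
func isKnownCSAM(_ photoQueuedForUpload: Data) -> Bool {
    knownImageHashes.contains(imageHash(photoQueuedForUpload))
}
```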

Human intervention is positioned as a type of fail-safe designed to catch the very few false positives (false allegations of someone possessing CSAM), rather than being the first line of defence.
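
And, as I understand it, nothing even reaches a reviewer until an account crosses a threshold number of matches. Here's a back-of-the-envelope Swift sketch of why that makes a false accusation of an innocent account vanishingly rare; every number in it is my own assumption for illustration, not an Apple figure.

```swift
import Foundation

// If each uploaded photo has an independent chance p of falsely matching the
// hash database, the chance that an innocent account with n photos reaches a
// review threshold of t matches is the binomial tail P(X >= t). When n*p is
// far below t, that tail is dominated by its first term, computed here in
// log space to avoid underflow.
func log10FalseReviewChance(photos n: Int, perPhotoRate p: Double, threshold t: Int) -> Double {
    let logChoose = lgamma(Double(n) + 1) - lgamma(Double(t) + 1) - lgamma(Double(n - t) + 1)
    let logTerm = logChoose + Double(t) * log(p) + Double(n - t) * log1p(-p)
    return logTerm / log(10.0)
}

// Hypothetical numbers: 100,000 photos, a one-in-a-million false-match rate
// per photo, and 30 matches required before any human review happens.
print(log10FalseReviewChance(photos: 100_000, perPhotoRate: 1e-6, threshold: 30))
// ≈ -62, i.e. essentially never for an innocent account
```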

As for concerns about governments forcing Apple to use this feature to spy on their citizens, my response is that given the numerous hoops they would have to jump through, there are likely already more practical and easier ways of doing so than trying to retrofit Apple’s CSAM-detection system or getting Apple to create a different version of it. So while not impossible, I am willing to go out on a limb and say that a lot of these "what if" scenarios are simply slippery-slope arguments. The story of a vengeful ex stealing your iPhone and storing child porn on it just so it gets flagged sounds more like the plot of a Hollywood movie.

I don't see it as an invasion of my privacy, and I don't think there's much opportunity for abusing it, but it still doesn't answer the initial question - what's in it for me as the end user? What's the benefit to me in being cleared of possessing child porn on my iPhone when I was never under any suspicion of having any to begin with? Any pedophile aware of said feature is simply not going to use iCloud to store such material, and it feels more like a way for Apple to get law enforcement off their backs than a way to protect users (and protect us from what, exactly?).

I guess my point at the end of it all is that this to me is a discussion that deserves to be a lot more nuanced than being for or against privacy, especially when Apple's proposed implementation looks to be a lot less invasive than what other companies are already doing.
 
It's like now that iPhones have a dedicated Neural Engine alongside the CPU, Apple is trying to find good reasons to use it.

About the Communication Safety Feature for Kids : The real feature here is teaching children that it's OK for their digital devices to spy on them and snitch on them.

About the CSAM detection mechanism : Who the hell would upload CSAM to their iCloud account? It is easy to see the slippery slope here. After CSAM, it will detect "terrorist propaganda". After "terrorist propaganda", it will be "antisocial content". "Sensitive political material". And it won't be just your iCloud account; that wouldn't be efficient, would it?

This is for your safety, citizen! Whether that will take 5 or 10 or 20 years, this is the first step. Apple was the only company trying to fight the good fight here, but it's over. They have flipped. They are helping build a nightmare, totalitarian surveillance society. And paving the way with good intentions.

Please do not answer if you are going to reply "but the government does it too!". Well, clearly they weren't taking it this far if Apple needs to implement this for our "safety". If you accept this nightmare panopticon, don't drag other people down with you. Understand we don't all want to live in an open air concentration camp.

After using and loving Apple products for 30 years, I will be phasing out Apple and going full Linux. The Pixel with a custom ROM could work too. Not such great UX, but anything to avoid this hell.
 
This is so tiring. I'm about thiiiiiis close to de-digitalizing my mobile life entirely. These kinds of things are always pitched as "it's to protect the children" or some other righteous cause to get people onboard. I don't have kids, so I don't really have a horse in that race, but people need to see this for what it probably is: a portal to mass, out-in-the-open surveillance, social-credit tracking, etc. I'm fine going back to 2005-era tech if that's the case. While I love tech in general and the convenience it often provides, so little of this stuff is actually necessary for day-to-day life.
 
CSAM scanning is coming.

And just like Apple must have worked on CSAM scanning for a long time without telling anybody, they might be working on a whole lot of other things too. And all those things are coming to our devices, like it or not.
If you have a Gmail account or use Facebook, you've already been exposed to CSAM scanning.
 
If someone accesses your phone and puts those images onto your account, tell me how you're going to explain that they are not "yours"?

What are you talking about? This is a parental control concerning images being sent/received through the Messages app, not images that are simply on your device/account.
 
I find it both sad and hilarious that people are going to technologically castrate themselves because of their tin-foil hat assumptions about a simple parental control feature . . . that is not even mandatory to use! Truly unbelievable!
People are getting upset because Apple is enabling more bad parenting practices. With good parenting, this wouldn’t even be an issue, because the kid either wouldn’t be in this situation or would know how to handle it. They wouldn’t need their phone to snitch to their helicopter parent because they chose for themselves whether the photo was okay to view. The kid presses the button, sees the image, and then deletes it or reports it (or it was a false positive), but the parent just gets a notification that their kid looked at a nude and goes off the rails, instead of the kid going to the parent and having a proper discussion about it. This is just going to make things worse.

If the kid really wanted to see the picture and not send a notification, they’d just wait until the sender could show them the picture in person. Apple is tackling this the wrong way just like they want to do with CSAM detection, and that’s why people are upset.
 
Ban kids from iPhone. Make it an adult only device. They don’t even need to be on the internet. The internet is not a safe place for kids and most parents suck at monitoring what their kids are up to.
Let’s go one further and just ban smartphones entirely. I see way more adults using their devices irresponsibly than kids. Thousands of people (including children) have died from adults using phones behind the wheel.
 
People are getting upset because Apple is enabling more bad parenting practices. With good parenting, this wouldn’t even be an issue, because the kid either wouldn’t be in this situation or would know how to handle it. They wouldn’t need their phone to snitch to their helicopter parent because they chose for themselves whether the photo was okay to view. The kid presses the button, sees the image, and then deletes it or reports it (or it was a false positive), but the parent just gets a notification that their kid looked at a nude and goes off the rails, instead of the kid going to the parent and having a proper discussion about it. This is just going to make things worse.

If the kid really wanted to see the picture and not send a notification, they’d just wait until the sender could show them the picture in person. Apple is tackling this the wrong way just like they want to do with CSAM detection, and that’s why people are upset.

So you're telling me people are saying they're going to completely leave the Apple ecosystem because some bad parents might go ape if their children choose to view a sexually explicit photo (that they were given fair warning about before viewing)??? You've got to be joking.

The irony here is that this is about the best balance you can achieve between parental monitoring and a child's "privacy", since they are given fair warning that viewing the photo will alert their parents. You see, Apple's goal here is not to "snitch" on kids but to discourage them from viewing the content in the first place.
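
For what it's worth, here's how I picture the flow, sketched in Swift. The names and exact conditions (the age cutoff, the parental opt-in) are my guesses based on Apple's description, not their actual code:

```swift
// Everything below runs on the child's device; nothing is sent to Apple.
struct ChildAccount {
    let age: Int
    let parentalNotificationsEnabled: Bool  // parent opted in via Family Sharing
}

enum MessagePhotoAction {
    case showNormally
    case blurWithWarning(notifyParentIfViewed: Bool)
}

func handleIncomingPhoto(flaggedSensitiveByOnDeviceClassifier isSensitive: Bool,
                         account: ChildAccount) -> MessagePhotoAction {
    guard isSensitive else { return .showNormally }
    // The photo is blurred and the child is warned first. Only if the child
    // chooses to view it anyway, and is 12 or under with notifications turned
    // on, does the parent get an alert.
    let notifyParent = account.age <= 12 && account.parentalNotificationsEnabled
    return .blurWithWarning(notifyParentIfViewed: notifyParent)
}
```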

Personally, I would never allow a preteen child to have their own iPhone without heavy safeguards in place. It's just irresponsible and negligent not to. If your concern is about parents going ballistic, then nothing Apple does or doesn't do is going to change that. Those types of parents will create their own ways of monitoring their child without Apple's help. Don't throw the baby out with the bathwater. There is no such thing as a technology that can't be exploited or abused. Yet we don't just eliminate technology because of that.
 
Ban kids from iPhone. Make it an adult only device. They don’t even need to be on the internet. The internet is not a safe place for kids and most parents suck at monitoring what their kids are up to.

Stuff that’s banned gets more popular because of that.
 
People no longer value their privacy as they once did. Then again, people didn't use to worship corporations the way some now worship Apple Computer.

Apple Computer was renamed to Apple Inc. about 14 years ago. And worshiping is certainly not what’s happening here.
 
So you're telling me people are saying they're going to completely leave the Apple ecosystem because some bad parents might go ape if their children choose to view a sexually explicit photo (that they were given fair warning about before viewing)??? You've got to be joking.

The irony here is that this is about the best balance you can achieve between parental monitoring and a child's "privacy", since they are given fair warning that viewing the photo will alert their parents. You see, Apple's goal here is not to "snitch" on kids but to discourage them from viewing the content in the first place.

Personally, I would never allow a preteen child to have their own iPhone without heavy safeguards in place. It's just irresponsible and negligent not to. If your concern is about parents going ballistic, then nothing Apple does or doesn't do is going to change that. Those types of parents will create their own ways of monitoring their child without Apple's help. Don't throw the baby out with the bathwater. There is no such thing as a technology that can't be exploited or abused. Yet we don't just eliminate technology because of that.
The people leaving Apple are against the continuous march to authoritarianism, and part of the backlash here might actually stem from people conflating the CSAM-detection part with the iMessage protection, rather than being about the iMessage protection itself. I will leave Apple if CSAM detection gets implemented on-device.

I’m personally against iMessage protection because like CSAM detection, it’s fighting a good fight but in completely the wrong way. I was raised in such a way that I had my own private computer at age 13 with no restrictions whatsoever, but was responsible enough to never do anything on it that would get me in trouble. My parents trusted me to control myself, and I respected that.

It’s like guns. So many kids get injured or killed because they play with guns. Why do they? Because instead of being taught how to properly handle a gun and to respect it, many parents just tell their kids “don’t mess with it”. So, many kids think it’s like what they see in games and movies, and it’s all fun to be a rebel and play with something they’re not supposed to until the gun goes off. Again, I was taught how to handle guns, what to and what not to do with them, and had opportunities to properly and safely shoot them, and so had no desire to mess around with them when I shouldn’t have.
 
The people leaving Apple are against the continuous march to authoritarianism . . .

I'm sorry, but giving parents a parental control option is not Apple "marching to authoritarianism". You and others are blowing this thing WAY out of proportion and letting your imaginations run wild. Maybe reconsider leaving when/if Apple actually does something wrong, such as censoring dissenting political content on everyone's devices, without giving them any choice, and reporting them to the government.

I was raised in such a way that I had my own private computer at age 13 with no restrictions whatsoever, but was responsible enough to never do anything on it that would get me in trouble. My parents trusted me to control myself, and I respected that.

Yeah, and what year was that? Things are quite a bit different in 2021 than they were back in, say, 1997, when I got my first PC as a 13-year-old . . . different both technologically and socially. And even though there of course were still dangers online back then, they have multiplied exponentially since then, and many more parents are far more cognizant of them these days. And of course there's also the mobile aspect of the phone vs. a home-based computer.

For a parent in this day and age to give unrestricted/unmonitored Internet access to a child is foolish. Just because you might trust your child doesn't mean you don't put safeguards in place. You do that because you love them and don't want to see them hurt by foolish decisions they might make on the spur of the moment and/or due to peer pressure. It's the same concept as why you don't just throw a mixed group of middle schoolers together at a party without chaperones.

It’s like guns. So many kids get injured or killed because they play with guns. Why do they? Because instead of being taught how to properly handle a gun and to respect it, many parents just tell their kids “don’t mess with it”. So, many kids think it’s like what they see in games and movies, and it’s all fun to be a rebel and play with something they’re not supposed to until the gun goes off. Again, I was taught how to handle guns, what to and what not to do with them, and had opportunities to properly and safely shoot them, and so had no desire to mess around with them when I shouldn’t have.

Guess what? You can still keep safeguards regarding firearms in your house (such as locking them up and supervising their use) AND teach your child how to respect them and properly use them. What you don't do is just open your gun cabinet and tell your child, "Ok, remember everything I taught you, and have fun. Your mom and I will be back after supper!" LOL!
 
Without completely giving away my age, I’ll just say that the iPhone was already out when I got my first laptop.

The issue with authoritarianism is that by the time one can prove it’s happening, it’s too late. That’s why you’re seeing backlash to Apple’s child safety program. We can see where this will lead and are trying to stop it before it starts. But again, this has more to do with CSAM detection, not iMessage protection.

Of course responsible parents wouldn’t give the go ahead to use guns unattended. The point I was trying to make is that I was taught about guns and given the opportunity to safely shoot them with family members, and thus the desire to mess with them when parents weren’t around was gone. There wasn’t a shroud of mystery around them.
 
I don't really care about CSAM detection per se, only insofar as it's a trojan horse and grooming for other intrusive authoritarian garbage downstream. This won't stop here; eventually you could be compelled as a witness against yourself. The theme is always the same: "it's just" something to protect the kids, "it's just" an app to keep everyone "safe", it's just this, it's just that. Imagine some algo misidentifying you as something you're not. Who are you going to appeal to? These companies are faceless and completely unaccountable, and they certainly won't answer to you. You do you if you're comfortable with all of this; that's absolutely your right.
 
Without completely giving away my age, I’ll just say that the iPhone was already out when I got my first laptop.

Your precise age isn't exactly highly confidential info.

The issue with authoritarianism is that by the time one can prove it’s happening, it’s too late.

Then using that logic, we need to start boycotting all kinds of technology, because it *could* be a baby step that's part of some authoritarian conspiracy. Like I said, there's no such thing as technology that can't be exploited or abused by people.

Of course responsible parents wouldn’t give the go ahead to use guns unattended.

So then why are people here acting like responsible parents should let their child dive into the deep end of the internet and social media unattended? Plenty of dangers there.

The point I was trying to make is that I was taught about guns and given the opportunity to safely shoot them with family members, and thus the desire to mess with them when parents weren’t around was gone. There wasn’t a shroud of mystery around them.

Yes, obviously parents should teach their children to use the internet/social media responsibly, but that doesn't mean you don't have safeguards/supervision in place as well.
 
All depends on how it’s implemented.

Android has come a long way. Many of us are only with Apple due to inertia.
I'm only with Apple because on Android my carrier forces me to use their homebrew Visual Voicemail app, while on Apple I can use the built-in one.
 
Personally, I feel that this is a fairly non-controversial topic, but I also think that's because it's a straightforward feature with a clear intended audience and a clear-cut benefit. This feature is being pitched at parents who would appreciate more support in managing their children, and children below the age of 12 likely don't have the finances to purchase their own smartphones to begin with; they are typically using hand-me-downs from their parents. It's a pretty compelling argument to stay within the Apple ecosystem and for the entire family to keep using iPhones.

When I prepare a presentation, I typically start by putting myself in the shoes of the audience and asking myself "why am I here and what's in it for me?". Alerting parents when children under the age of 12 open potentially explicit photos is something that, in my opinion, needs to become common practice across the industry. It’s a “why did it take companies this long to figure this out?” type of thing.

That said, I do think that Apple's PR has done a horrendous job this year, especially by bundling this iMessage feature together with their far more controversial CSAM-detection feature, and I do understand why nobody really has any reason to support the latter. Even if I have no child porn on my Apple devices, and even if I have nothing to lose from having my photos scanned, it still brings me back to my initial question - what's in it for me? At best, there is no downside, which is no way to sell a feature.

My take on the CSAM-detection measures proposed by Apple is that they would actually be far less invasive than what other companies like Facebook are currently doing (basically scanning every single photo you upload to their service, not just for pornography but data-mining it for everything it is worth). Nor is Apple combing through your entire photo library. Instead, what Apple is doing is comparing hashes of known child pornography against hashes of the photos you upload to iCloud. Apple is not looking through your vacation photos for images of your 5-year-old daughter in a swimsuit. Apple still won't know what your photos contain, either before or after they are uploaded to iCloud. In short, Apple appears to have come up with a way to find CSAM images uploaded to one’s iCloud photos without actually dissecting and probing photo libraries in a way that voids privacy, assuming it works as advertised.

Human intervention is positioned as a type of fail-safe designed to catch the very few false positives (false allegations of someone possessing CSAM), rather than being the first line of defence.

As for concerns about governments forcing Apple to use this feature to spy on their citizens, my response is that given the numerous hoops they would have to jump through, there are likely already more practical and easier ways of doing so than trying to retrofit Apple’s CSAM-detection system or getting Apple to create a different version of it. So while not impossible, I am willing to go out on a limb and say that a lot of these "what if" scenarios are simply slippery-slope arguments. The story of a vengeful ex stealing your iPhone and storing child porn on it just so it gets flagged sounds more like the plot of a Hollywood movie.

I don't see it as an invasion of my privacy, and I don't think there's much opportunity for abusing it, but it still doesn't answer the initial question - what's in it for me as the end user? What's the benefit to me in being cleared of possessing child porn on my iPhone when I was never under any suspicion of having any to begin with? Any pedophile aware of said feature is simply not going to use iCloud to store such material, and it feels more like a way for Apple to get law enforcement off their backs than a way to protect users (and protect us from what, exactly?).

I guess my point at the end of it all is that this to me is a discussion that deserves to be a lot more nuanced than being for or against privacy, especially when Apple's proposed implementation looks to be a lot less invasive than what other companies are already doing.
Why does every feature have to be about you? Just because there's no benefit to you directly doesn't make it a bad feature. What it will do is catch the people who are stupid enough to put that stuff in iCloud. That's what it's for. It's not for you and it literally won't affect your life whatsoever, so why worry over nothing?
 