Yuck! CSAM. First wave of the rollout, watch! Don't let Apple use the word "kids" as an excuse. This is CSAM. It's all related, connected.

Is there a way to disable this feature? Or is it set by default by Apple?

SMH!


Update: 11:24AM PST.

So many disagreements… you guys don’t play around here.

If this is all true… then this is a good addition. CSAM is way more controversial honestly. I hope it is not connected to CSAM.
You're catching a lot of flak, but I'm with you on this one. There is a reason why they call this stuff a slippery slope.
 
As predicted, Apple is going to sneak CSAM scanning into iOS piecemeal after the backlash (and rightfully so) they received when they tried shoving it in all at once. People, the EFF, and countless others need to hold Apple's feet to the fire and demand they abandon this backdoor spyware IMMEDIATELY & COMPLETELY. The Apple apologists need to wake up and realize what their beloved company is doing.

Apple can no longer be trusted with privacy or security.

This feature will also detect adult nudity and pornography. It has nothing to do, technically, with the CSAM Detection System.

And it's not a backdoor, since it must be enabled, and it can only be enabled for an account that is part of a family plan.
 
Someone help me out here... whether it's opt in or opt out, the feature is officially programmed and built in, so it would come down to trusting Apple / backdoors, right?

If you are using a device in which one company controls all the hardware, the firmware, the operating system, the APIs, and even the application distribution, you have to trust this company or stop using the device.

Such a company has millions of ways to fool you if they want to.
 
I predict that there will be a rush of app updates targeting iOS 15.1, as many, me included, believe many Apple consumers won't go past iOS 15.1!

And if Apple doesn't provide assurance that new iPhones won't include iOS 15.2, iPhone unit sales will "stall out" considerably!

Make NO mistake, a death blow to iPhone unit sales is coming!
Go buy a Samsung Galaxy. Problem solved.
 
It is already an opt-in feature. It will be disabled by default; parents who want the feature can enable it.

Yes, I get that.
My take is that Apple should not be trying to play parent. That is NOT their purview.
Now take this functionality and make it an app.

Instead, Apple builds it into the OS, adding more bloat.
And for those who turn this on for a child's phone, the kids will mostly blow right through it, IF they are even using Messages.

Not seeing the sense in this.
 
Someone help me out here... whether it's opt in or opt out, the feature is officially programmed and built in, so it would come down to trusting Apple / backdoors, right?
The difference between this and CSAM detection is that this does not compare hashes or report to anyone. It uses an algorithm to search for nudity, just like it finds cats in your photos. The result is a blurred image and a message. So even if there were a back door, where someone figured out how to turn this toggle on in your phone without you knowing, the worst thing that would happen is you'd get some blurred images. At that point you'd know you were hacked and could get it fixed. No real big consequence here if something should go wrong.

CSAM detection, on the other hand, compares specific image hashes that you must trust are of child porn. There's no way of knowing for sure what it's looking for, making it great for spying, especially since Apple insists that it follows the laws of each country it operates in. It then reports its findings to someone (wherever the laws of that country say they should go, disguised as a child protection service). Hey... what could possibly go wrong with that?
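The distinction that poster is drawing can be sketched in a few lines of code. This is a hypothetical illustration, not Apple's actual implementation: the hash values, function names, and threshold are all made up. The point is that one approach matches against an opaque list the user can't audit and reports elsewhere, while the other only scores content locally and blurs it on-device.

```python
# Hypothetical sketch contrasting the two approaches discussed above.
# All names and values here are invented for illustration.

# Approach 1: hash-list matching (CSAM-detection style).
# The list contents are opaque; the user must trust what the hashes represent.
known_hashes = {"hash_of_flagged_image_1", "hash_of_flagged_image_2"}

def hash_match(image_hash: str) -> bool:
    # A match would trigger a report to a third party; the user
    # has no way to inspect what the list actually contains.
    return image_hash in known_hashes

# Approach 2: on-device classifier (Communication Safety style).
# A local model scores the image; the only action is blurring it locally.
def classify_and_blur(nudity_score: float, threshold: float = 0.9) -> str:
    # Nothing leaves the device; the worst failure mode is a
    # harmless false positive (a wrongly blurred photo).
    return "blurred" if nudity_score >= threshold else "shown"
```

Under this framing, the failure modes differ sharply: a bad entry in the hash list silently generates a report, while a bad classifier score merely blurs an image the user can still choose to view.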
 
Ok so this update is going to be the one that breaks the encryption of iMessage and makes it less secure. Got it.

So that means the next update our phone will spy on us, this one breaks iMessage.
 
A lot of knee jerking 🦵🏻 going on here from people who haven't even read the article!

Sounds like more than just knee jerking.

The number of people losing their minds over a feature meant to protect children from being sent illegal material, or being asked to send it, is eye-opening as to how much society has underestimated the prevalence of child sexual exploitation.
 
Someone help me out here... whether it's opt in or opt out, the feature is officially programmed and built in, so it would come down to trusting Apple / backdoors, right?

Using Apple has always been about trust.

We Apple users buy Apple gear because we trust Apple, and I think this is one of the really important aspects of Apple that people don't understand: Apple really goes the distance in building trust relationships with its customers.

This doesn't mean that Apple is perfect or beyond reproach. Far from it. But Apple has been very good in the areas I do care about, while I find I can still tolerate the areas in which they are weak. Their relatively few competitors are the inverse, and I think you'll find that while a lot of Apple users may struggle to articulate this point, it will nevertheless come out along these lines if you poke them the right way.

It's the same as how people connect with their babysitter or hairdresser. I don't evaluate them solely on objective metrics. Instead, we connect based on how well we communicate, whether we trust them to be truthful and fair with us, how well we approach a given problem, and so on. That doesn't mean I am a cultist to the 40-year-old down the street who cuts my hair out of her apartment; it just means that my mom and I trust her and have built a rapport with her, and that's about 90% of what we're buying as part of that service.

To sum it all up, Apple users buy trust, not specs.

And Apple has my trust in this; and in the CSAM feature if and when it ever gets rolled out.
 
If you don't trust Apple, how do you know they didn't add all sorts of creepy tracking/spying/reporting stuff five years ago, and they're just keeping it a secret -
I'm certain they did.

Why is everything fine up to this point, but NOW you think they're out to spy on you?
Never said it was. I'm positive they are full of it on all of these topics, and always have been.
 
I think some people doth protest too much. This feature just puts up a warning for people under 13 when a picture contains nudity. Given how many creepy dudes send pics of their junk to women on Tinder, I wouldn't be surprised if some dude likes to send pictures of his junk to children 13 and under (which is already a felony in all 50 states). This just keeps people who opt in from seeing junk pics.

If your hill to die on is "Kids should be able to see unsolicited pictures of someone's junk without warning" then have fun dying on that hill.
 