Apple clearly never went to a primary school in the past, or listened to kids' public conversations.
It can be shocking!
Their conversation is often just ************************************************
They aren't all as innocent as Apple thinks they are.
 
They can search it all; they own all data on iCloud and want to own all the data on your device.
Not only images, but all types of data, probably even including industrial espionage.
Apple doesn't own all data on their iCloud servers. Apple rightfully has access to the photos stored on their servers. Big difference between owning and having access.
 
At the current time, there is no sign of CSAM wording in the iOS 15.2 beta, so Apple may first introduce Communication Safety before implementing the full suite of Child Safety Features.
Apple won't issue the wording. The wording will be "we have a warrant to search your premises" or "you are under arrest" issued by the cop who comes to the door of the person who uploads it.
 
I just don't understand. Apple says the CSAM stuff is done on device and known images are reported. But in that case, couldn't Apple use "on-device machine learning" as an excuse to carry out anything a government wants them to do by law? What's to stop a government from making a law that requires Apple to report images that promote homosexuality? I'm usually not a slippery slope kind of guy, but this concerns me.
Nothing stopping that kind of scenario. Apple would have to honor such laws unless it decided to stop doing business in that country.
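The worry in this exchange comes down to how "known image" matching works: the device only checks whether a photo's hash appears in a list someone else supplies, so the list's contents could in principle be anything. A minimal sketch of that idea, with purely illustrative names (a real system such as Apple's NeuralHash uses a perceptual hash robust to resizing and recompression, not the cryptographic digest used here):

```python
import hashlib

# Hashes supplied by an outside party; the device cannot tell what they depict.
KNOWN_HASHES = {"a1b2c3", "d4e5f6"}

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; a cryptographic digest is used here
    # only to keep the sketch self-contained.
    return hashlib.sha256(image_bytes).hexdigest()[:6]

def should_report(image_bytes: bytes) -> bool:
    # The device checks only membership in the supplied list. Whether the
    # list contains CSAM or anything else is entirely up to the provider,
    # which is the slippery-slope concern raised above.
    return image_hash(image_bytes) in KNOWN_HASHES
```

The point of the sketch: the matching step itself is content-neutral, so the safeguard lies in who curates the hash list, not in the code on the device.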
 
Vulnerable people will die as a result of this
That's exaggerating a bit. I see your point, but someone would have to be under 13 AND receive a photo flagged as nude AND open the photo AND click through the warning before parents are notified. Not that it can't happen, especially to young people who just blindly smash buttons, but I doubt the false positive rate among photos received by 12-year-old children would be high enough to be a huge concern. You also have to consider that it could actually protect children who legitimately get harassed and are too scared to talk about it.

That said, I do have total sympathy for LGBT children and think anyone who disowns or harms their child because of their sexual or gender identity is a horrible person and should not be allowed to have kids. But for now I'm slightly more positive given all the safeguards that have been put in.

And so far none of the documents say that the actual photo will ever be stored unencrypted on Apple's servers or sent to the parents - of course, parents can simply take the child's phone, but at least there aren't wider privacy implications of this (even though your account would have to be marked as 12 or younger for this system to kick in anyway, for ages 13-17 it's completely on-device and the parents are never notified).
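The gating described in these two posts can be written out as a simple condition chain. This is a hedged sketch of the logic as the posts describe it, not Apple's implementation; the function name and parameters are illustrative:

```python
def parents_notified(age: int, flagged_nude: bool,
                     opened: bool, clicked_through_warning: bool) -> bool:
    # Parents are notified only if every step in the chain happens:
    # a flagged photo arrives, the child opens it, and the child
    # clicks through the explicit warning.
    if not (flagged_nude and opened and clicked_through_warning):
        return False
    # For ages 13-17 the blurring and warnings stay entirely on device;
    # only accounts marked 12 or younger can trigger a notification.
    return age <= 12
```

Every condition has to hold at once, which is why the post argues the chance of an accidental notification is low.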
 
Nothing stopping that kind of scenario. Apple would have to honor such laws unless it decided to stop doing business in that country.
Right. Look, the US government is very flawed, but I'm not cynical about them abusing something like this. This feature probably won't affect any US citizens except actual child sex offenders.

But I’m looking at the bigger picture. I can see the governments of Russia, Iran, Saudi Arabia, etc. abusing this technology.
 
but we have not been able to confirm that the feature is active because it requires sensitive photos to be sent to or from a device set up for a child
Surely you have a spare iPhone you could set up for a fake child to test it?

This should serve as a reminder to all that Apple has the ability to find not only child pornography, but any sort of pornography.

Perhaps they already do...

You can search for "Sunsets" and "Beaches" in Photos, maybe Apple is also keeping track of other types of images at the same time.
Interesting idea. I’ve never tried searching for “nude” or “breasts” or “penis” etc in my photos. “Cats” works.

Pedantic. It's pretty obvious what people mean.
The problem with that argument is “obvious” is subjective. What’s obvious to you isn’t always obvious to somebody else coming from a different context. Even if you can’t think of an example of someone who might not find this one obvious, the principle applies in many other situations, so the habit of using careful, accurate, unambiguous wording is a good one to be in.
 
You’re all afraid of this so-called slippery slope, as though Apple doesn’t already have the power to spy on you if they wanted to! How about pausing for just a moment, using your own head (instead of just parroting what everyone else says), and taking a look at what this technology actually does… because from where I’m standing, it’s helping to address a huge problem and in a pretty sensitive way. Young people have ended their lives over online bullying and the sharing of nude images. But don’t worry yourselves about that… go on, grab your popcorn, or rant and rave about how unfair this is for you, you poor, poor entitled iPhone owners.

(For the record, I also support the far more controversial CSAM technology, but I’m not talking about that here. I’ve already posted at length about that elsewhere.)
 
This screams "some Karen" from California all over.

That being said... my fiancé is a social worker, and judging from the stories I have to hear on a daily basis, some parents need help being parents.

As long as these "safeguards" can't be exploited by law enforcement without a legal and highly justified warrant... have at it
 
You’re all afraid of this so-called slippery slope, as though Apple doesn’t already have the power to spy on you if they wanted to! How about pausing for just a moment, using your own head (instead of just parroting what everyone else says), and taking a look at what this technology actually does… because from where I’m standing, it’s helping to address a huge problem and in a pretty sensitive way. Young people have ended their lives over online bullying and the sharing of nude images. But don’t worry yourselves about that… go on, grab your popcorn, or rant and rave about how unfair this is for you, you poor, poor entitled iPhone owners.

(For the record, I also support the far more controversial CSAM technology, but I’m not talking about that here. I’ve already posted at length about that elsewhere.)
It's not about whether Apple wants to spy on you, it's about foreign governments requiring Apple to spy on you. Foreign governments will try to abuse this technology. Apple already censors the App Store to appease Russia and China.
 
I understand wanting to filter material for children, but that really should be the parents' responsibility. Companies should probably filter certain materials within their corporate scope of responsibility; for example, if someone was breaking the law on Facebook and trying to prey on children, then I think Facebook should ban them. But trying to police the entire internet is an impossible task, and it isn't the corporations' responsibility to police it. They should report suspicious activity to the parents and authorities, though. As the old saying goes, the road to hell is often paved with good intentions. Ultimately, the parental units are responsible for protecting their kids. It is far too easy to let kids entertain themselves on the internet so as not to be constantly bothered by them. However, if they brought them into this world, then they damn sure should watch over them and protect them. I think Apple is confusing good intent with responsibility. It is a thin line between good intent and encroaching into someone else's business. Perhaps Apple should try to bring together many of the major tech companies with parents, legal advisors, and lawmakers and approach it from that angle.
 
It's not about whether Apple wants to spy on you, it's about foreign governments requiring Apple to spy on you. Foreign governments will try to abuse this technology. Apple already censors the App Store to appease Russia and China.
Nothing new here. Governments (foreign or otherwise) already pressure Apple for access to our private data. At some point you either need to trust Apple’s public statements about this, or throw your phone away and go live in the wilderness. (A lot to be said for that actually!)
 