It does not surprise me one bit that most of this crowd is actually opposed to these features.
Yes, there is a need to be rude. Reading comprehension is so basic, and it’s frustrating when people flip out before taking the time to educate themselves. Gotta call out that behavior because it’s seriously warped.
No need to be rude.
APPLE, YOU DO NOT HAVE THE RIGHT TO VIOLATE MY PRIVACY OR TO ACCESS ANYTHING ON MY DEVICE.
Typical Tim Cook BS - “all about human rights,” then supports PRC. “All about privacy,” then scans all images on someone’s phone.
A lot of you guys didn't read the whole article or misunderstood something. This only flags images that match a database of known child abuse images or "visually similar" images (cropped / resized / color change / etc.). There's basically no chance that your little grandson's first bath will be flagged.
This really isn't the concern here. I think the concern is that this is a slippery slope.
I read the article and still have a concern. It’s a slippery slope from identifying hashes in a database to scanning other photos on your iPhone, to detecting speeding or drunk-driving patterns (not that that would be a bad thing).
If you read how the process will work, this kind of picture would never be flagged. That doesn’t make the move less controversial or more acceptable, but Apple wants one of your saved pictures to match a database of child abuse photos before anything is flagged to anyone. Nudes of you or photos of your kids taking a bath would not match that database in the first place.
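To make the database-matching point concrete, here is a rough toy sketch in Python. It uses a simple "average hash" rather than Apple's NeuralHash, and the database values are invented; the point is only that the system asks "is this fingerprint one of these specific, already-identified files?", never "what does this photo depict?"

# Toy illustration only -- not Apple's NeuralHash; the hashes below are made up.
from PIL import Image

def average_hash(path: str) -> int:
    """Shrink to 8x8 grayscale, then set one bit per pixel that is above the mean."""
    img = Image.open(path).convert("L").resize((8, 8))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Hypothetical fingerprints of known, already-circulating abuse images.
KNOWN_HASHES = {0x8F3A61B2C4D5E0FF, 0x123456789ABCDEF0}

def is_flagged(path: str) -> bool:
    # Only the 64-bit fingerprint is compared; the photo is never content-classified.
    return average_hash(path) in KNOWN_HASHES

A family photo that isn't already in that database has nothing to match against, no matter what it shows.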
No, the negative comments here are from people who recognize, based on long experience, what happens once the camel’s nose is under the tent!
The negative sentiment here is what happens when people don’t read the article, and don’t understand the parts they do read. Learn how to read, then come back and comment.
Did you even read the article? It's not "scanning photos" but matching hashes of known, shared child pornography images.
What about photos of "Baby's first bath"? Will those users get treated as child exploiters?
No… only images with hashes in the database will be reported. Not just any image. It’s the only way to do it while respecting privacy; they don’t want to actually scan your photos.
Absolutely. False positives are a legit concern. But the early responses seemed to think that any borderline child images were going to be reported.
The CSAM thing doesn't detect/determine content of images. It checks photos against a database of specific (actively circulating) child abuse images.
Not to say there aren't legitimate concerns, but worrying that it is going to somehow flag your own kid's photos is not one of them.
(The child safety thing does detect content, but it seems the worst it does is throw up a warning/blurring if you have it on.)
Awesome! Thank you for clarifying. I appreciate it.
The CSAM database is of known abuse images. So regular nudes shouldn't trigger them.
And the blurring of explicit photos seems to be a parental control thing.
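Since the thread keeps blending the two features together, here is a rough pseudo-Python sketch of the distinction. The function names and return strings are mine, not Apple's APIs: CSAM detection is a fingerprint lookup on photos headed to iCloud, while the Messages child-safety feature is an on-device check whose worst outcome is a blur plus a warning on a child's account.

# Illustrative only -- simplified logic and invented names, not Apple's actual APIs.

def handle_icloud_photo(photo_hash: int, known_csam_hashes: set) -> str:
    """CSAM detection: fingerprint lookup against known images, no content analysis."""
    return "match recorded" if photo_hash in known_csam_hashes else "nothing happens"

def handle_incoming_message_image(looks_explicit: bool, child_account: bool) -> str:
    """Communication safety in Messages: handled on device, nothing sent to authorities."""
    if child_account and looks_explicit:
        return "blur the image and show a warning"
    return "show the image normally"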
If you read the article carefully you’ll see these kinds of photos will not raise an alert.
Hopefully those images aren’t easy to download. And if you knew the link to them, you’d be a criminal already. Also, that’s why Face ID and passcodes exist.
So a malicious person grabs your phone, downloads a CSAM image onto it, and what happens next? The FBI shows up and you become Jared Fogle's roommate?
No, too far, Apple.
What is going to keep you from scanning my library for NeuralHash matches against politics you don’t like? Or criticism of mainland dictatorial China?
And even if that doesn’t happen in the US, what will keep other countries (read above) from doing just that to their citizens?
Still a violation of privacy, assuming they do it with images on the phone.
Nobody is violating your privacy. You will only be affected if you are a criminal with child photos, and then you should be arrested, shamed, and jailed.
As long as their false positive rate is zero, I have no problem with this. I’d hate for my childhood photos or medical photos to be flagged. I use iCloud for its security here.
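On the false-positive worry: no perceptual hash has a literal zero false-match rate, but Apple's technical summary describes a threshold of matches that must be reached before an account can even be reviewed, and that threshold makes an innocent library crossing the line astronomically unlikely. A back-of-envelope Python sketch, with numbers made up for illustration rather than Apple's actual figures:

from math import comb

def prob_account_flagged(n: int, p: float, t: int) -> float:
    """P(at least t false matches among n independent photos) -- binomial tail,
    summed term by term so the tiny result doesn't get lost to rounding."""
    term = comb(n, t) * (p ** t) * ((1 - p) ** (n - t))
    total = term
    for k in range(t, n):
        term *= (n - k) / (k + 1) * p / (1 - p)  # step from k matches to k + 1
        total += term
        if term < total * 1e-17:                 # rest of the tail is negligible
            break
    return total

# 10,000 photos, a pessimistic 1-in-a-million false match per photo,
# and a threshold of 30 matches before review:
# prob_account_flagged(10_000, 1e-6, 30)  ->  roughly 4e-93 (effectively zero)

The exact numbers don't matter; the point is that requiring many independent matches drives the account-level false-positive probability toward zero even when per-image false matches are possible.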
Perhaps they’ll only divulge to authorities whether any images have been flagged IF the authorities ask specifically and have a warrant. Fine by me. I don’t see Apple referring people to authorities blindly.
calling it now......
Read the whole article carefully. While there are plenty of legitimate concerns about this, I don't think false flags are one of them. They are scanning for known child abuse imagery or visually similar edits of those known images (e.g. cropped, filtered, etc.). So your personal family photos aren't going to get flagged.
In other words, they're not scanning for something vague like "a picture of a naked child".
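"Visually similar" just means the fingerprint is a perceptual hash: a resized or lightly edited copy of a known image lands within a few bits of the original's fingerprint, while an unrelated photo does not. A rough sketch of that tolerance, reusing the toy average_hash idea from the earlier snippet (the distance cutoff here is invented):

def hamming_distance(a: int, b: int) -> int:
    """Count of differing bits between two 64-bit fingerprints."""
    return bin(a ^ b).count("1")

def matches_known(photo_hash: int, known_hashes, max_distance: int = 5) -> bool:
    # A cropped/resized/filtered copy of a known image stays within a few bits of
    # that image's stored fingerprint; a genuinely different photo has no nearby
    # entry to land next to, so it simply never matches.
    return any(hamming_distance(photo_hash, k) <= max_distance for k in known_hashes)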
How so? They are hashing the photo and looking at the hash, not at the photo.
The only people who should be opposed to these specific features are people who are committing crimes, and people who don’t care about the safety of their children.