This is still very creepy. Not sure what exactly Apple's motive is.

I'm just not going to allow Apple to go through my iPhone. Hopefully, that option is given.

There is no option if you use Photos or iCloud Photos. However, upon reading more about this, it apparently works by encoding your image into a "hash string," which isn't the actual image, and then comparing that hash string to a list of KNOWN child abuse images from the internet (so unique images of your children playing in the bath aren't even an issue). If, and only if, you have a certain number of exact matches do they "manually" (i.e., a person) look at the images and determine whether a crime is occurring that they need to report. They claim there is a less than 1 in a TRILLION chance that a flagged account doesn't actually contain significant amounts of true child abuse imagery. I need to read more, but perhaps it isn't as creepy as it first sounded to me... but they need to be very, very transparent about what is going on.
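In very rough terms, the mechanism would look something like the sketch below. To be clear, this is a guess at the shape of it: a plain SHA-256 digest stands in for Apple's perceptual NeuralHash, and the empty hash list, the threshold, and the library are all placeholders, not anything Apple has published.

```swift
import CryptoKit
import Foundation

// Illustrative only: exact SHA-256 stands in for Apple's perceptual NeuralHash.
let knownAbuseHashes: Set<String> = []   // in reality, a list of known-image hashes shipped by Apple
let reviewThreshold = 30                 // placeholder; the real threshold is unpublished

// Encode an image's bytes into a hex "hash string" (the image itself isn't stored).
func hashString(for imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Count how many photos in a library match the known-bad list exactly.
func matchCount(in library: [Data]) -> Int {
    library.filter { knownAbuseHashes.contains(hashString(for: $0)) }.count
}

// Only past the threshold would a human ever look at anything.
let photoLibrary: [Data] = []            // placeholder for the user's photos
if matchCount(in: photoLibrary) >= reviewThreshold {
    print("account flagged for manual review")
}
```

Note that your unique bath photo produces a hash that matches nothing in the list, so it never even becomes a candidate.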
 
There are sha1sum reverse hash search engines out there on Tor, where you can put in a hash and it gives you the canonical URL of the origin.

So I don’t know how well the fruits of this, albeit morally righteous, endeavour would play out if all of that source material were to be, for example, uploaded to Pastebin.

Not good, I would expect.
 
Well, it's not even a question of just imagery. They could start tagging everything that someone doesn't like, such as a repressive government suddenly requiring flags on anything that contains certain phraseology in text form. Then that material could be cataloged and reported to that government when it comes to certain iPhone users within their political control.
 
Okay, let’s leave aside the child pornography part of it for a second. That’s hideous and needs to be addressed, but again, leave it aside.

Apple is scanning our photos for
“potentially illegal content, including child sexual exploitation material”

It includes child pornography, but what else are they scanning for that they’re not telling us about? I’m a middle-aged, dull mom, so my photos are all goofy family photos, photos of pets, and some landscapes and architectural stuff.

But good lord some of the spring break photos some of you folks must have…😆 Big Bro Tim Cook is giving you some serious stink eye!
 
Am I misunderstanding, or is Apple planning to scan every single one of your private photos in the hope that there’s someone out there stupid enough to upload an unedited version of an already-known abusive photo?

Why aren’t they scanning all your keyboard inputs for online bullying?
 
Are you okay with Apple looking at your photos and wondering if you're abusing or not abusing someone?

Once you get used to this, the goalposts will be moved to other political wrong-thinking. Not Covid compliant? We'll call CPS.

I have never used iCloud, nor will I ever enable it on any of my Apple devices.
 
Don’t understand the big deal.

This isn’t people at Apple scanning people’s photos; it’s on-device AI looking for matches against an existing database.

Your photo library is already being scanned for faces (the People feature) and other points of interest.
But instead of trying to understand what is going on, people would rather just freak out about everything...
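For what it's worth, the face scanning is a real, shipping example of this kind of on-device analysis. A few lines against Apple's Vision framework show the idea; it runs entirely locally, and no photo leaves the device:

```swift
import Vision
import CoreGraphics

// The People feature relies on this kind of local analysis:
// Vision's face detector runs on-device, with no server involved.
func countFaces(in image: CGImage) throws -> Int {
    let request = VNDetectFaceRectanglesRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    return request.results?.count ?? 0
}
```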
 
If there is a match, that photograph is uploaded with a cryptographic safety voucher, and at a certain threshold a review is triggered to check whether the person has CSAM on their device.
Your photo has a hash collision with a known CSAM image. It then gets uploaded to Apple for review. Their server is already compromised, or gets compromised later, and a hacker takes your photo. They then distribute it to the world.
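As a rough picture of the flow being described (everything below is a stand-in: the real design reportedly uses private set intersection and threshold secret sharing, so the vouchers are undecryptable until enough of them accumulate):

```swift
import Foundation

// Stand-in types only; none of this is Apple's actual API.
struct SafetyVoucher {
    let payload: Data   // stands in for an encrypted derivative of the photo
}

let threshold = 30                        // placeholder; the real value is unpublished
var vouchers: [SafetyVoucher] = []

func didUpload(photo: Data, matchedKnownHash: Bool) {
    guard matchedKnownHash else { return }    // non-matches carry no usable voucher
    vouchers.append(SafetyVoucher(payload: photo))
    if vouchers.count >= threshold {
        // Only now can the server reconstruct the key and see the contents,
        // which is also the point where a compromised server becomes a risk.
        print("threshold crossed: vouchers decryptable, human review triggered")
    }
}
```

The worry above is exactly that last step: once the voucher contents are decryptable server-side, a breach exposes real photos, not just hashes.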

Great intentions, but this is just bad.
 
You should let law enforcement install cameras in your home then, so they can make sure you are not doing anything illegal while you take a shower, for example. After all, you have nothing to hide, do you?
You vill eat ze bugz!
 
Apple should add scanning for:

1. Photos of the Confederate flag.
2. Photos of people not wearing Covid masks.
3. Photos of Chinese people disrespecting the Chinese government.
4. Photos of Middle Eastern women not wearing burkas.
5. Photos of a group of people with too many whites, not enough blacks.
and for comments such as these.
 
Tell me the Gov't won't force Apple to scan for other, non-child-porn-related images that the Gov't deems in the interest of national security. Of course that will all be done under cover of darkness via a FISA warrant, so you'll never even know your rights are being trampled on. They just need to get the technology installed, tested, and working first; then it's welcome to Stasi-ville.
 