See, this is confusing, because Apple has ALWAYS scanned your iCloud data and then handed it over to authorities, both the feds and local police. In this case they are using on-device AI to scan for CSAM content, which is exactly the same content they used to scan for before.

Actually, Apple has not. Currently Apple does not do server-side scans like, say ... Google. Apple does comply with warrants for iCloud data, as Apple holds the iCloud encryption keys.
 
For me it's the gateway to other invasions....

It probably started with Google: "we have AI scan your emails to better serve you ads." No human involvement, but I still don't want anyone doing that, nor do I trust that anyone, especially Google, isn't harvesting other data from my personal emails.

Now Apple wants to scan hashes of pictures that will be sent to iCloud because they don't want CSAM on their servers (nor does anyone). Next to no human involvement until they think you're a pedo; then someone looks through the flagged pictures to see if they are CSAM. There is a chance the picture wasn't CSAM, but now some stranger has already looked at YOUR PERSONAL PICTURES. Might have been a picture of a flower pot, might be a pic of a far more personal nature; no way to know until it's too late.
One in a trillion chance of that happening.

Also, they already have the technology in place and running for years now. What's to stop them from adding other hashes to it now? Ever thought of that?
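For anyone wondering what "scanning hashes" actually means mechanically, here's a minimal, purely illustrative Python sketch. This is not Apple's NeuralHash (a perceptual hash designed to survive resizing and recompression); I'm using plain SHA-256 on the file bytes just to show the matching logic, and `KNOWN_BAD_HASHES` and the folder name are made-up placeholders, not any real database or path.

```python
import hashlib
from pathlib import Path

# Placeholder blocklist of hex digests. In the real system this would be a
# blinded database derived from NCMEC's list of known CSAM, not raw SHA-256.
KNOWN_BAD_HASHES = {"0" * 64}

def fingerprint(image_path: Path) -> str:
    """Exact-match fingerprint of the file bytes (stand-in for a perceptual hash)."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def photos_to_flag(photo_dir: Path) -> list[Path]:
    """Return the photos whose fingerprint appears in the blocklist."""
    return [p for p in photo_dir.glob("*.jpg") if fingerprint(p) in KNOWN_BAD_HASHES]

if __name__ == "__main__":
    flagged = photos_to_flag(Path("camera_roll"))
    print(f"{len(flagged)} photo(s) would be flagged for review")
```

And the "what's to stop them adding other hashes" worry is visible right there: expanding the scope is just adding entries to that set, which is why the contents of the database matter so much.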
 
Point me to when they were scanning on-device, looking for certain images, before this announcement.
Remember when Apple announced that instead of cloud processing Siri would do everything locally, or that on-device AI would read the text in images using device power instead of the cloud? Similarly, on-device AI will hash the images headed for iCloud and check the bulk of them against the known hashes.
 
Point me to when they were scanning on-device, looking for certain images, before this announcement.
They have been scanning every single photo uploaded to iCloud for CSAM since 2019, just like they're going to do in iOS 15. So you weren't "safe" before either. Don't use iCloud Photo Library, or don't update to iOS 15, to keep things the way they are now (files are still scanned and can still produce a one-in-a-trillion false positive).
 
Point me to when they were scanning on-device, looking for certain images, before this announcement.
One of the big announcements at WWDC was the ability for the phone to recognize plants and animals... but god forbid they try to stop child pornography.

Siri also performs on-device functions that search your phone.
 
One of the big announcements at WWDC was the ability for the phone to recognize plants and animals... but god forbid they try to stop child pornography.

Siri also performs on-device functions that search your phone.
And Spotlight.

Edit: This makes the "what happens on your phone stays on your phone" ad even more true in my opinion. Almost everything that was done server-side is now handled by your phone for better privacy.
 
Apple doing the scan on device is a big deal for several reasons.

First, ask why they don't just do this in the cloud. They are cheap. They want to leverage the local Neural Engine in the iOS device to do this workload. At scale it's a huge computational load, and they want their users to pay for it.
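To put a rough number on that, here's a back-of-envelope sketch; both inputs are my own assumptions for illustration, not Apple figures.

```python
# Back-of-envelope: what hashing every uploaded photo server-side might cost.
# Both inputs are assumptions for illustration only, not Apple numbers.
photos_per_day = 1.5e9     # assumed iCloud photo uploads per day
seconds_per_hash = 0.005   # assumed CPU time to hash one photo

cpu_seconds = photos_per_day * seconds_per_hash
cores_running_nonstop = cpu_seconds / 86_400  # seconds in a day

print(f"{cpu_seconds / 3600:,.0f} CPU-hours per day, "
      f"or about {cores_running_nonstop:,.0f} cores running 24/7")
```

Trivial for Apple's data centers in absolute terms, but every bit of it disappears from their bill the moment each phone hashes its own photos on the Neural Engine.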

Further, consider how much goodwill they just nuked. They could have put distributed use of those A-series chips toward something positive, like protein-folding analysis to help with COVID. No, instead Apple chose to treat their entire userbase like suspected pedophiles. That burns a lot of charity.

The fact that the scan is done on the user's phone, without their consent, and *prior* to uploading makes this a warrantless search that Apple is conducting as a fishing expedition on behalf of law enforcement.

Law enforcement cannot do this without a warrant which requires probable cause.

NCMEC is a private foundation, but it is funded by the US Justice Department. Anything Apple refers to them will be reported to the FBI or other agencies. It was also co-founded by longtime infomercial hawker John Walsh, father of Adam Walsh.

People who think Apple will not make a mistake really overestimate the level of care Apple will use. Likely, their employees will never actually see the CSAM photo; they will simply look at the match count and forward the account to NCMEC for review.

Comparisons to cloud-hosted data being scanned are simply not the same as what Apple is doing here, and the way they dropped this has been unbelievably badly handled.

This will keep building as a PR disaster, and we will see if Stella Low can handle it. She's from the UK, and maybe she just doesn't understand the Fourth Amendment landmine Apple just stepped on. She was likely on a team that signed off on this whole thing in advance. Jobs' longtime PR chief Katie Cotton (who left in 2014) would have seen this coming.

Then there is the mission creep of adding new hash tables of wrongthink to check for, "for the children" or to protect you from terrorists. The precedent that Apple can use our personal resources to incriminate us without cause is intolerable and destructive to the brand.

iCloud Photos ON = you giving consent. It will be in the iOS 15 user agreement that when you turn on iCloud Photos you are giving Apple the right to have your device scan for CSAM. Apple can't be sued, because iCloud Photos is opt-in and the USER makes the choice to allow the device to scan image hashes for CSAM.
 
They have been scanning every single photo uploaded to iCloud for CSAM since 2019, just like they're going to do in iOS 15. So you weren't "safe" before either. Don't use iCloud Photo Library, or don't update to iOS 15, to keep things the way they are now (files are still scanned and can still produce a one-in-a-trillion false positive).
If this is true, then what is so novel about this announcement? They're literally telling everyone that they have added hashes specifically looking for things to report. This can be manipulated by any government according to that country's standards. I wasn't worried about trees and animals.
 
And Spotlight.
Right... and that is what is so hilarious about the image hashes. Apple is only taking all of these extra steps with this project to avoid any incorrect access, given the sensitive nature of what they are doing and the possible outcomes if the wrong images are flagged.

They could always do this, governments can still always ask them for access, and hackers can still hack into iCloud.

On-device scanning isn't new... it's the steps that follow that Apple is being extra careful about, so the user's privacy is protected as much as possible.
 
If this is true, then what is so novel about this announcement? They're literally telling everyone that they have added hashes specifically looking for things to report. This can be manipulated by any government according to that country's standards. I wasn't worried about trees and animals.
Since 2019, every photo uploaded to iCloud has been checked against the same list of known CSAM images using hashes. If they wanted to expand the list of hashes, they could've done that since 2019. The only difference now is that the check is done while the photo is being uploaded, not after.
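In code terms, the "while being uploaded vs. after" distinction is just a question of which side runs the same check. A toy sketch with hypothetical names (the real pipeline uses NeuralHash and encrypted safety vouchers, not plain SHA-256):

```python
# Toy illustration of where the same blocklist check runs. All names and the
# plain SHA-256 hashing are stand-ins, not Apple's actual implementation.
import hashlib

BLOCKLIST = {"0" * 64}  # placeholder hex digests

def matches_blocklist(photo_bytes: bytes) -> bool:
    return hashlib.sha256(photo_bytes).hexdigest() in BLOCKLIST

def server_side_scan(uploaded_photos: list[bytes]) -> list[int]:
    """Pre-iOS 15 style: photos land on the server first, then get checked there."""
    return [i for i, p in enumerate(uploaded_photos) if matches_blocklist(p)]

def client_side_upload(local_photos: list[bytes]) -> list[tuple[bytes, bool]]:
    """iOS 15 style: the device runs the same check on the way up, pairing each
    upload-bound photo with a match flag (the role the safety voucher plays)."""
    return [(p, matches_blocklist(p)) for p in local_photos]
```

Same list, same comparison; the only thing that moves is which CPU computes the hash.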
 
You're just all over the place, aren't you? The thread is about Apple's CSAM scanning, not other companies. More potential for abuse on another platform does zero to mitigate the potential for abuse on Apple's platform.

How about I don't want my device scanned by Apple or anyone else without my consent?

I saw a linked article earlier (forget where) that said Gmail had been scanning for CSAM since 2014 or something, and it resulted in evidence in one case. One.

What is all of this actually going to accomplish? It opens the door to further privacy intrusions under the banner of "think of the children" while protecting no one from anything. CP *******s aren't sending their goods through iCloud, my man.

Want to search my ****? Get a warrant.

That leads to the big "WHY?" question. Why is Apple choosing this method? Seeing the results from Google, I have to really wonder. And why not do it on the server side, where everyone else does? This looks like a lot of effort for pretty much no gain.
This pre-encryption scan on my device by Apple is very concerning. I have a lot of personal and proprietary info on my devices. Lastly, who provides Apple with the CSAM hashes? The government?

Maybe I am paranoid, but I can see "device side" being open to abuse. Can you say "FISA"? How about another nation state?
 
And that’s my problem with it.
Okay, don't update to iOS 15 then. iOS 14 will still receive security updates (if you trust security updates, that is). I mean, after all, aren't you trusting EVERYTHING Apple says and does on your device?
 
If this is true, then what is so novel about this announcement? They're literally telling everyone that they have added hashes specifically looking for things to report. This can be manipulated by any government according to that country's standards. I wasn't worried about trees and animals.
It's novel because Apple is taking the extra step of taking illegal material and passing it along to the authorities.

People are choosing not to believe that there is a one-in-one-trillion chance that an innocent person's account is flagged (at which point, by the way, an Apple employee gets involved to verify... no one is automatically forwarded to the authorities).
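For what it's worth, the one-in-a-trillion figure is plausible once you require many independent matches before anything is reviewed. A rough Poisson-style estimate with assumed numbers (the per-photo error rate, yearly photo count, and 30-match threshold are my illustrations, not Apple's published model):

```python
from math import log, lgamma

p = 1e-6         # assumed chance a single innocent photo falsely matches
n = 10_000       # assumed photos uploaded by one account in a year
threshold = 30   # assumed number of matches required before human review

lam = n * p      # expected false matches per account per year

# P(exactly `threshold` false matches) under a Poisson approximation; the tail
# P(X >= threshold) is dominated by this term because lam is tiny.
log_p = -lam + threshold * log(lam) - lgamma(threshold + 1)
print(f"Roughly 10^{log_p / log(10):.0f} chance of an innocent account hitting the threshold")
```

Even with an absurdly pessimistic per-photo error rate, the threshold does the heavy lifting, which is exactly why the flag triggers on an account, not on a single image.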
 
That leads to the big "WHY?" question. Why is Apple choosing this method? Seeing the results from Google, I have to really wonder. And why not do it on the server side, where everyone else does? This looks like a lot of effort for pretty much no gain.
This pre-encryption scan on my device by Apple is very concerning. I have a lot of personal and proprietary info on my devices. Lastly, who provides Apple with the CSAM hashes? The government?

Maybe I am paranoid, but I can see "device side" being open to abuse. Can you say "FISA"? How about another nation state?
How would "server-side" not be open to abuse, though? Why couldn't governments force Apple to add more hashes to the list and expand the search to other content?
 
That leads to the big "WHY?" question. Why is Apple choosing this method? Seeing the results from Google, I have to really wonder. And why not do it on the server side, where everyone else does? This looks like a lot of effort for pretty much no gain.
This pre-encryption scan on my device by Apple is very concerning. I have a lot of personal and proprietary info on my devices. Lastly, who provides Apple with the CSAM hashes? The government?

Maybe I am paranoid, but I can see "device side" being open to abuse. Can you say "FISA"? How about another nation state?
It's so they can protect your privacy even more... would you prefer someone scanning every single photo you have in iCloud, or looking for a very specific set of data that is only flagged AFTER, or IF, it is passed along to iCloud?
 
It's novel because Apple is taking the extra step of taking illegal material and passing it along to the authorities.

People are choosing not to believe that there is a one-in-one-trillion chance that an innocent person's account is flagged (at which point, by the way, an Apple employee gets involved to verify... no one is automatically forwarded to the authorities).
This is the response I was looking for. “We will now report these images to the authorities”. Just wait and see what that expands to.
 
Data on the iCloud server farm is encrypted; you cannot pick out an individual image from an encrypted format (at least it's not supposed to work that way 😄). The AI on the device hashes images shared with iCloud and then compares those hashes to the CSAM hashes. If the hashes match, it sends a safety voucher up for review; if there's no match, it does nothing with the hashes/images. They would have to remove the in-transit encryption.

It is encrypted, however Apple holds the keys to the server-side info. They can decrypt it at any time, for example in response to a warrant.
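A compressed sketch of the flow described above, with placeholder names; note that in Apple's actual design the device never learns whether a hash matched (private set intersection), so the real voucher logic is cleverer than this:

```python
# Sketch of the described on-device flow. Names, the plain hashing, and the
# voucher contents are placeholders, not Apple's real NeuralHash/PSI pipeline.
import hashlib
from typing import Optional

CSAM_HASHES = {"0" * 64}  # placeholder hex digests standing in for the NCMEC list

def image_hash(photo_bytes: bytes) -> str:
    return hashlib.sha256(photo_bytes).hexdigest()

def make_safety_voucher(digest: str) -> dict:
    # Stand-in "voucher": the real one is an encrypted payload the server can
    # only open after an account crosses the match threshold.
    return {"digest": digest}

def process_icloud_upload(photo_bytes: bytes) -> Optional[dict]:
    digest = image_hash(photo_bytes)
    if digest in CSAM_HASHES:
        return make_safety_voucher(digest)  # sent up alongside the photo for review
    return None  # no match: nothing extra accompanies the upload
```

Per the description above, only photos already headed to iCloud go through this path; local-only photos never touch it.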
 
This is the response I was looking for. “We will now report these images to the authorities”. Just wait and see what that expands to.
Only if there are multiple matching images in a single account... and yes, if they match the database images, that person is in possession of illegal child pornography. You DON'T want them reporting those people to the authorities?

Or is your concern that the person may in fact be innocent and was somehow hacked so that illegal images ended up on their phone? Are we back to fantasyland again?
 
Only if there are multiple matching images in a single account... and yes, if they match the database images, that person is in possession of illegal child pornography. You DON'T want them reporting those people to the authorities?

Or is your concern that the person may in fact be innocent and was somehow hacked so that illegal images ended up on their phone? Are we back to fantasyland again?
Read the last sentence of the post you quoted in order to understand my concern.
 
iCloud Photos ON = you giving consent. It will be in the iOS 15 user agreement that when you turn on iCloud Photos you are giving Apple the right to have your device scan for CSAM. Apple can't be sued, because iCloud Photos is opt-in and the USER makes the choice to allow the device to scan image hashes for CSAM.
Actually, the installation of iOS 15 is the consent, but yes.
The hash database will be there regardless, as well as the ability for it to be invoked via API.

If you trust Apple legally but do not want your photos scanned, do not enable iCloud Photos.
If you don't trust Apple's software engineering on this feature, do not install iOS 15.
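Sketching that split with hypothetical names: the hash database and matching code ship with the OS either way, but the check only ever applies to photos entering the iCloud upload queue.

```python
# Hypothetical gating logic: the database is present after installing iOS 15,
# but consulting it is tied to the iCloud Photos switch.
from dataclasses import dataclass

@dataclass
class Settings:
    icloud_photos_enabled: bool = False  # the user-facing opt-in

def should_run_csam_check(settings: Settings, photo_is_upload_bound: bool) -> bool:
    """The database exists regardless; it is only consulted for upload-bound photos."""
    return settings.icloud_photos_enabled and photo_is_upload_bound

print(should_run_csam_check(Settings(icloud_photos_enabled=False), True))  # False
print(should_run_csam_check(Settings(icloud_photos_enabled=True), True))   # True
```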
 
Read the last sentence of the post you quoted in order to understand my concern.
I read it... what are you implying? Again, are we living in the real world, where illegal activities should be reported? Or are you living in a fantasy land where Apple, governments or hackers are changing the whole hashed-picture system Apple has set up so it finds and reports content that is legal yet unaccepted??

It has been said and agreed to before, by me and others... they can already do this. How does this particular option Apple has added make that any easier??
 