
hasanahmad (original poster)
A case could be made, without the outrage, that even iCloud data shouldn't be scanned, but that's not what the outrage is about. The primary objection of pundits and some users is that the scan occurs on the device rather than in iCloud.

So my questions, and really my confusion, are these:

If the only data available for Apple to scan on the device is exactly the same as the iCloud data, why does it matter to users where the scan occurs? Especially if the scan is done by on-device AI and Apple is only contacted when there is a BULK of hashes on a device that match the ones on Apple's hash servers.
Users are objecting that the scan isn't occurring on servers. Well, it would be the exact same data as what is available to scan on the iPhone. All the other data is still locked away from Apple.

Another question I would ask: why the outrage against Apple specifically for doing photo-hash-only scanning, when in fact Google scans private emails and all other online content for more categories than CSAM, Microsoft scans all online storage for more than CSAM, Amazon scans private data on all its online drives, and FB and Twitter scan private DMs and Facebook Messenger chats for CSAM? So why is the outrage fixated on Apple, when in fact Apple is not only scanning for a lot less, but has also said it will not expand the categories it scans for or who it gives the scans to? You could always argue you don't trust Apple, but if that's the case, then no mainstream tech company is better for you.

These questions are separate from WHY they are scanning to begin with, but I have a question about that as well: if Apple is not allowed to scan anything, how do we stop CSAM, the worst of society?
 
I've seen you posting a lot in the existing threads, so I hope this response is helpful. To your first question, I gave my answer in the other thread.

hasanahmad asked: "why the outrage against Apple specifically on doing specifically photo hash only scanning… So why is the outrage fixated on Apple?"

Why is there outrage specifically against Apple? This part seems easy. Apple has billboards and advertisements bragging about how privacy-conscious they are. Just two months ago they launched their "Mind Your Own Business" ad.

Microsoft, FB, and Twitter made no such claims about privacy. When you're on their platforms, you're on your own. People specifically bought iPhones because "what happens on your iPhone, stays on your iPhone."

So that part at least seems clear. People disagreed with Microsoft, FB, and Twitter, and that led them to Apple. Now Apple has changed its tune.


 
I switched to iCloud Photos about five years ago, obviously to easily manage my photos across all my devices and my Mac. I don't think I could go back to using a phone without internet services, and I'm not even a heavy smartphone user.

I do use Google services as well, because I think they are the best; I'm not nearly as anti-Google as most here. But at least I know what I'm getting with them. I don't think there is a way around this for me, since switching to a Google device running Android isn't any more private.
 
hasanahmad said: "If the only available data TO scan… why does it matter where the scan occurs to users? … So why is the outrage fixated on Apple?"

The outrage exists because (a) it's Apple and (b) it's Apple being "anti-privacy," even though, as you point out, the scan is mostly on-device, and Apple has stated that anything in iCloud that's not end-to-end encrypted can be handed over to the proper authorities if requested. With CSAM detection, I see them giving authorities less data, because now they can "detect the bad stuff" and hand over only that.

Another reason for the outrage is that Apple is huge on privacy, and something like this could lead to them "flipping the switch" on anything else they want to scan. I have faith Apple won't, but as a person who cares about privacy and security, it's always a possibility.

At the same time, I don't store any of my photos in the cloud, and anything that's not end-to-end encrypted, or anything I don't want in the cloud (any cloud, not just Apple's), isn't in the cloud :). The best way to make sure your data isn't accessible is to either keep it behind lock and key or just not put it in the cloud 🤷
 
So your argument is that Apple should not scan for CSAM at all? That becomes more a moral argument than a privacy argument.
Not sure how you got that from what I said. You asked "why the outrage against Apple specifically," and I answered that Apple has advertised itself as pro-privacy and this CSAM scanning seems like an invasion of privacy. Let me know where the gap is in our discussion.
 
Quoting the reply above: "Apple has billboards and advertisements bragging about how privacy-conscious they are… People specifically bought iPhones because 'what happens on your iPhone, stays on your iPhone.'"
See, this is confusing, because Apple has ALWAYS scanned your iCloud data and then given it to authorities, to the feds and local police. In this case they are using on-device AI to scan for CSAM in exactly the same content they used to scan before.
 
Quoting the reply above: "Apple has advertised itself as pro-privacy and this CSAM scanning seems like an invasion of privacy."
But Apple has always scanned iCloud content when they gave it to police and the feds. Where was this same outrage then? In this case they are moving the scanning of the exact same data to on-device AI. The content TO scan is the same, whether they were sharing iCloud content with police before or letting the device AI scan it tomorrow.
 
Oh, OK, that makes more sense. If Apple was always doing this sort of scanning, then what's different here? On-device vs. in the cloud, which seems like a small difference.

My understanding is that Apple was not always scanning for CSAM on iCloud. They were able to decrypt iCloud for law enforcement, but they weren't simply scanning all photos. My understanding is that this is a new program AND it's going to run live on our devices and scan every photo.

But yes, if Apple had been scanning iCloud for CSAM for years, then I can see your confusion about why the outrage picked up at this moment.
 
hasanahmad said: "But Apple has always scanned iCloud content when they gave it to police and the feds. Where was this same outrage then?"
Most people didn't know that Apple was scanning iCloud content. This news of scanning was new information that made them feel Apple's "Privacy" stance was all marketing.
 
A general area of concern is the potential for abuse.

Today we're scanning photos, tomorrow we're scanning texts/Messages. Then we're scanning emails. Next we're scanning Signal. Now we're scanning everything.

Plus, lots of comparisons are made to Gmail, OneDrive, etc. Those things aren't personal the way an iPhone is. I believe a large number of people find it uncomfortable to be associated with CSAM in the first place. This is evidenced by the nice, friendly acronym "CSAM"; it's much softer than "child porn." "You've taken my personal device and are now having a conversation about child porn with it." People don't like that.

But I think the potential for abuse, founded or not, is what people are really upset about.
 
Also, not everyone uses iCloud. I use parts of iCloud, but my personal photos are only stored on my phone; I don't upload them to iCloud. Photos that come through Messages go through iCloud for me, but not anything from the Camera app.
 
Quoting the reply above: "My understanding is that Apple was not always scanning for CSAM on iCloud… this is a new program AND it's going to run live on our devices and scan every photo."
Apple was not scanning for CSAM; they were scanning and vetting content before giving it to police and the feds. For example, they gave the feds content from iPhone users on Jan 6, and they give content to local police when warrants are served.

What have they NOT given to police? iPhone content locked under E2E encryption, such as in that famous terrorism case where they said they couldn't hand it over because they don't have the keys to unlock the Secure Enclave of the terrorist's iPhone.

The on-device CSAM scan will occur only when an iCloud photo is taken or received, because each photo taken or received is set to be uploaded to iCloud, and those photos are not end-to-end encrypted. The device AI will scan the hashes, just like local Siri starting in iOS 15 and the local on-device AI that does picture-to-text search of your photos in iOS 15. Apple only finds out if a BULK of matching hashes is discovered. How many? It could be 10, 30, or 50 photos; no one knows, but it's not ONE.
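Roughly, the threshold idea looks like this. This is just a sketch in Swift, not Apple's pipeline: SHA-256 stands in for Apple's perceptual NeuralHash, and the hash set, photo source, and threshold value are all made-up placeholders.

import Foundation
import CryptoKit

// Illustrative only: SHA-256 stands in for Apple's perceptual NeuralHash.
let knownBadHashes: Set<String> = []   // stand-in for the CSAM hash database
let reportingThreshold = 30            // Apple has not disclosed the real number

func hexDigest(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

let photosQueuedForiCloud: [Data] = [] // stand-in for photos pending upload

var matchCount = 0
for photo in photosQueuedForiCloud where knownBadHashes.contains(hexDigest(of: photo)) {
    matchCount += 1
}

// A single match does nothing; a report only happens once the
// per-device count crosses the threshold.
if matchCount >= reportingThreshold {
    print("Threshold reached: safety vouchers would go to human review")
}

The point of the threshold is that one unlucky or planted match never reaches a human.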
 
Quoting the post above: "Not everyone uses iCloud… my personal photos are only stored on my phone; I don't upload them to iCloud."
At this point, you are safe from scanning, although that may change in the near future with iMessage.
 
Quoting the post above: "Not everyone uses iCloud… my personal photos are only stored on my phone; I don't upload them to iCloud."
Then your phone won't scan for CSAM.
 
Quoting the reply above: "A general area of concern is the potential for abuse… Today we're scanning photos, tomorrow we're scanning texts/Messages."
Isn't the potential for abuse greater with other companies, since they scan photos AND hashes and have no known human vetting? Just one photo sent by a hacker could land you in jail. With the iPhone, the device AI scans hashes only, and only when a BULK of matching content is found is it sent to a human verifier. There are so many fail-safes that it's harder to abuse than Face ID is to crack.
 
This is an example of the hash of a Windows system image file (see the attachment below). Now tell me how you find out what's in the picture from that. Only machine AI can perform this in an instant, without someone LOOKING at your photos. If you want to generate one yourself, see the sketch after the attachment.
 

Attachments

  • 1628560951169.png (screenshot of a file hash)
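Here is a sketch in Swift that prints the same kind of fingerprint as the screenshot. I'm using SHA-256 because Apple's NeuralHash isn't public, and the file path is just an example.

import Foundation
import CryptoKit

// A hash is a one-way fingerprint: you can check whether two files are
// identical, but you cannot get the picture back from the digest.
let url = URL(fileURLWithPath: "/tmp/example.png")   // any image file
if let data = try? Data(contentsOf: url) {
    let digest = SHA256.hash(data: data)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    print(hex)   // 64 hex characters that reveal nothing about the image
}

(Note that a cryptographic hash like SHA-256 changes completely if one pixel changes; Apple's perceptual hash is designed to survive small edits, but the digest is equally unreadable.)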
hasanahmad said: "Isn't the potential for abuse greater with other companies… there are so many fail-safes."

You're just all over the place, aren't you? The thread is about Apple's CSAM scanning, not other companies. More potential for abuse on another platform does zero to mitigate the potential for abuse on Apple's platform.

How about I don't want my device scanned by Apple or anyone else without my consent?

I saw a linked article earlier (I forget where) that said Gmail has been scanning for CSAM since 2014 or so, and it has resulted in evidence in exactly one case. One.

What is all of this actually going to accomplish? It opens the door to further privacy intrusions under the banner of "think of the children" while protecting no one from anything. CP *******s aren't sending their goods through iCloud, my man.

Want to search my ****? Get a warrant.
 
Data on the iCloud server farm is encrypted; you cannot pick out an individual image from an encrypted format (at least, it's not supposed to work that way 😄). To scan server-side, they would have to remove the in-transit encryption. Instead, the AI on the device hashes images shared with iCloud and then compares those hashes to the CSAM hashes. If the hashes match, it sends a safety voucher up for review; if there's no match, it does nothing with the hashes/images.
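In rough code terms, the per-image decision just described could look like this Swift sketch. SafetyVoucher is an illustrative type (the real voucher carries an encrypted payload), and SHA-256 again stands in for the perceptual hash.

import Foundation
import CryptoKit

// Illustrative type; Apple's real safety voucher wraps an encrypted payload.
struct SafetyVoucher {
    let imageDigest: String
}

// Hash the photo, check it against the known set, and produce a voucher
// only on a match. On no match, the hash is simply discarded.
func voucher(for photo: Data, knownHashes: Set<String>) -> SafetyVoucher? {
    let hex = SHA256.hash(data: photo).map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex) ? SafetyVoucher(imageDigest: hex) : nil
}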
 
I do think that Apple will have a hard time keeping this out of governments' hands (and if they do have to release the keys to the kingdom, it will in all likelihood happen in secret courts).

Beyond that, I do think we have far too much violence against women and children in our society. Me, I'd like to see all the perpetrators of those crimes put on an island somewhere, and we set off a nuke every few years for population control.

So, what Apple is doing is a lot kinder than what I would implement. :D
 