Can someone explain why Apple thinks it's their job to make sure you're not a pedophile?

They're literally going to spy on you to make sure you're not a pedophile.

How is this within the scope of their business model?

I seriously cannot get my head around this.

It reminds me of the preachers who keep telling you being gay is bad, while they're secretly cheating on their wife with Geoffrey the stable boy.

My take is that these moves continue to be in line with Apple’s push to protect both users and their data.

As far as I can see, iMessage scanning and Siri search enhancements don’t seem to have been met with any degree of controversy. I feel that alerting parents when their children (under 12) open potentially explicit photos is something that needs to become common practice across the industry. My take is that it will become one of those “why did no one think of doing this before Apple” kinds of scenarios.

As for scanning your photos for CSAM material, what Apple has done here is come up with a way to find CSAM images being uploaded to iCloud Photos without actually dissecting and probing photo libraries in a manner that voids privacy. I suspect this will be a prerequisite to offering fully encrypted iCloud storage, an idea the FBI has been steadfastly resistant to up till now.

So rather than asking “why does Apple care whether I am a pedophile or not?”, I see it more as “what is Apple doing to be able to move on to the next step of further protecting my data while addressing the concerns of law enforcement?”

I don’t have CSAM on my iOS devices, I never will, and if this is a prerequisite to being able to encrypt my photos and cloud backups, then I say go for it. It’s pure upside for me.
 
People do realise that companies such as Google, Adobe, Facebook, et al. already use some form of automated technology to scan for and detect CSAM? Adobe does it with Creative Cloud:


That's just one example.

Yes, and the other companies are scanning the content uploaded to their cloud servers, not on your personal devices.

Cloud servers are part of their infrastructure, and they are free to scan them however they see fit. They aren’t conducting unwarranted searches on your personal property. There is a difference.

Apple is doing it the other way around. They are scanning on your device, before anything is sent to the cloud. Are you even following along?
 
I don't believe Apple needs to scan on device. They can catch the perps by scanning the photos in the cloud.
That requires their servers to be able to decrypt and inspect your data while it's being transmitted to (or after it's been written on) their "cloud" servers, no? If they can do that, then your cloud data isn't end-to-end encrypted.
 
That requires their servers to be able to decrypt and inspect your data while it's being transmitted (or already written) on their "cloud" servers, no? If they can do that, then your cloud data isn't end-to-end encrypted.
Anyone using iCloud Photos is automatically giving Apple the decryption key. In my opinion, Apple has the right to scan for CSAM on their servers and I don't have a problem with that, even though I don't use iCloud photos.
 
But as others have said... all of the big companies are doing similar things. So I dunno.

No, the other big companies are searching content uploaded to their cloud services. It’s different. The servers are part of their infrastructure.

This is not a reason to normalize unwarranted searches taking place on your personal property. Apple is conducting the search ON DEVICE that you own. We have the presumption of innocence.

For the record, I’m glad they are scanning for this content, but I think they are going about it the wrong way. If they WERE doing it like the other big tech companies, then I don’t think this would be such an issue.
 
No, the other big companies are searching content uploaded to their cloud services. It’s different. The servers are part of their infrastructure.

This is not a reason to normalize unwarranted searches taking place on your personal property. Apple is conducting the search ON DEVICE that you own. We have the presumption of innocence.

For the record, I’m glad they are scanning for this content, but I think they are going about it the wrong way. If they WERE doing it like the other big tech companies, then I don’t think this would be such an issue.

You're gonna get jumped all over for this, here's the scoop.

Assuming you're in America, the rights you're referring to only protect you from government searches. Apple is not the government. Therefore, Apple is not "conducting a search" whether it's warranted or not. You have no presumption of innocence by Apple. You have no right to free speech on Apple's platform, etc.
 
Have you thought about all the nefarious things Apple can do with the Camera app and the Photos app together with your location data?

That's a much better technology for looking for people performing certain kinds of activities.
Yes, now imagine a buggy algorithm spying on my messages and photos.
 
You're gonna get jumped all over for this,

Assuming you're in America, the rights you're referring to only protect you from government searches. Apple is not the government. Therefore, Apple is not "conducting a search" whether it's warranted or not. You have no presumption of innocence by Apple. You have no right to free speech on Apple's platform, etc.

Yes, and I fully acknowledge that the constitution doesn’t protect us from the tyranny of corporations. But I still think we should use our voices to decry this kind of surveillance.

Apple can build all the surveillance features they want into their cloud servers, but I don’t want them turning my phone into a surveillance device.
 
It seems you don't know what's going on here. The database is already under government ownership and control and has legal regulations on it! If the FBI did what you suggested, it would probably be a federal crime.

Also, there is a much easier way to catch Hong Kong protestors than this system: the Photos app together with location data.
You’re right, the government never would, and never has, stolen our data or spied on us.
 
Yes, and I fully acknowledge that the constitution doesn’t protect us from the tyranny of corporations. But I still think we should use our voices to decry this kind of surveillance.

Apple can build all the surveillance features they want into their cloud servers, but I don’t want them turning my phone into a surveillance device.

Yes, right there with you! We should decry this kind of surveillance, get policy changes, ditch Apple products, and find solutions. I've spent the afternoon learning about how to run Linux, and other phone options like Pinephone, Fairphone, and Librem.
 
You do know that the Photos app analyses every photo you take? All your photos are analysed so heavily, and the process is so resource-intensive, that the iPhone mostly does it only while connected to power.
Yes, I do. However, I don't use any of the advertised features (faces, places, etc.), so that limits it. Also, that is a completely different story.
 
One in a trillion relates to the CSAM API as tested, in this version, as asserted by Apple after private review.
iOS is not secure, certainly not to a one-in-a-trillion level.

Now Apple has created a way to target any iOS user for automatic referral to law enforcement. A bad actor plants a photo on your phone, and the rest is a fait accompli. You probably have thousands of photos; do you really scroll through all of them frequently?

I really don't care about the technical implementation of CSAM detection. It works or it doesn't. If it works, this creates a huge incentive for targeted phone hacking.

Unless Apple is going to assure us that every bug related to iOS, every malicious text escalation, has been permanently fixed.
How many Gmail accounts, Flickr drives, OneDrives, Amazon Drives, MacRumors private messages, and Twitter DMs (all of which are likely scanned for CSAM) have been targeted by bad actors since 2004? We are dealing with "could be" and "might be" here. Why not give an estimate?
 
I don't believe Apple needs to scan on device. They can catch the perps by scanning the photos in the cloud.
Exactly. Why does it matter where the hash scan occurs when the data available to Apple to scan is exactly the same as the iCloud data? Apple is not scanning photos; they are scanning hashes of the data.
 
Certain government entities could force Apple to do this today with existing features in the Camera and Photos apps.
How do you know Apple isn't already using face recognition in the Photos app to help China catch Hong Kong demonstrators?

Never claimed that, and that never was the topic of discussion. Also, your approach is a zero-sum game. You can say that about anything and anyone: how do you know all Linux kernels aren't secretly infected by a backdoor that hasn't been discovered in the library? Trust starts somewhere, and that is an individual matter. In this case, most people tend to agree the line has been crossed; if you think it was crossed long ago, okay then, but that changes nothing.
 
Exactly. Why does it matter where the hash scan occurs when the data available to Apple to scan is exactly the same as the iCloud data? Apple is not scanning photos; they are scanning hashes of the data.
Apple is processing the photos. That's how they generate the hashes of the data. They take the photo, hash it with their CSAM tool, and then compare the hash to the CSAM database.

"Why does it matter where the hash scan occurs?"
Good question, I'm wondering the same thing. Scanning on the device seems worse than scanning in the cloud, but I don't seem able to explain why.

1. If it's on iCloud, I have less of an expectation of protection. I'm storing the photos somewhere else that I do not fully control.
2. If it's on my device, it's using my resources, my device to do this scanning that is of no value to me. They're basically stealing from me.

That all doesn't seem super coherent to me, but I'm interested in improving.

What do you think and why?
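The hash-then-compare flow described above can be sketched in a few lines of Python. This is a toy: it uses an ordinary cryptographic hash (SHA-256), which only matches byte-identical files, whereas Apple's system uses a perceptual NeuralHash and a blinded database, neither of which is modeled here. The sample bytes and the hash set are made up for illustration:

```python
import hashlib

# Hypothetical database of known-bad image hashes (illustrative only).
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def image_hash(image_bytes: bytes) -> str:
    """Hash the raw image bytes. A real system would use a perceptual
    hash so near-duplicates match; SHA-256 matches exact bytes only."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_database(image_bytes: bytes) -> bool:
    """Compare the image's hash against the known-hash set."""
    return image_hash(image_bytes) in KNOWN_HASHES

print(matches_database(b"example-known-image-bytes"))  # True
print(matches_database(b"an-unrelated-photo"))         # False
```

Note that whoever runs this code never sees the photo's content, only its hash, which is the privacy argument being made; the counter-argument in the thread is about *where* the code runs, not what it computes.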
 
Apple is processing the photos. That's how they generate the hashes of the data. They take the photo, hash it with their CSAM tool, and then compare the hash to the CSAM database.

"Why does it matter where the hash scan occurs?"
Good question, I'm wondering the same thing. Scanning on the device seems worse than scanning in the cloud, but I don't seem able to explain why.

1. If it's on iCloud, I have less of an expectation of protection. I'm storing the photos somewhere else that I do not fully control.
2. If it's on my device, it's using my resources, my device to do this scanning that is of no value to me. They're basically stealing from me.

That all doesn't seem super coherent to me, but I'm interested in improving.

What do you think and why?
Apple is not processing the hashes; the on-device AI is, just like Siri doing work locally starting in iOS 15. We don't say Apple is processing the questions we ask Siri; we say Siri is doing it locally. So why can't the same principle apply to the photo hash scanning?

If Apple scans on iCloud servers, it is the same scan across the ENTIRE iCloud library, every minute of every day. It will rescan a lot of content whenever you change or update it on iCloud servers. It will scan zip files too. It will scan PDFs too if done in the cloud. And in the case of iCloud, it WILL be Apple scanning, not locally but on their servers.

On-device local AI scanning removes that, because it scans only what is needed and only ONCE, when a photo is added to iCloud. The scanning is faster and more efficient, and because the scan is not done by Apple on their servers, it is done only in the privacy of YOUR device... UNLESS you have a bulk of CSAM, which is the ONLY time Apple finds out, for human verification.
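The scan-once-at-upload versus rescan-the-whole-library contrast being argued here can be sketched as a toy; all names and behavior below are illustrative assumptions, not Apple's implementation:

```python
# Toy contrast: "scan each photo once at upload" (on-device model)
# vs. "re-check everything each pass" (server-side model).

already_scanned: set[str] = set()

def on_device_scan(new_photo_id: str) -> list[str]:
    """Check only the newly uploaded photo, and only once."""
    if new_photo_id in already_scanned:
        return []
    already_scanned.add(new_photo_id)
    return [new_photo_id]

def server_side_rescan(library: list[str]) -> list[str]:
    """A cloud-side pass re-checks every stored item."""
    return list(library)

library = ["a", "b", "c"]
print(on_device_scan("d"))          # ['d']  only the new item is checked
print(on_device_scan("d"))          # []     never re-scanned
print(server_side_rescan(library))  # ['a', 'b', 'c']  every pass
```

Whether a cloud provider actually rescans everything continuously, as the post asserts, is an assumption; the sketch only shows why work scales differently in the two models.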
 
Exactly. Why does it matter where the hash scan occurs when the data available to Apple to scan is exactly the same as the iCloud data? Apple is not scanning photos; they are scanning hashes of the data.
Apple is processing the photos. That's how they generate the hashes of the data. They take the photo, hash it with their CSAM tool, and then compare the hash to the CSAM database.

"Why does it matter where the hash scan occurs?"
Good question, I'm wondering the same thing. Scanning on the device seems worse than scanning in the cloud, but I don't seem able to explain why.

1. If it's on iCloud, I have less of an expectation of protection. I'm storing the photos somewhere else that I do not fully control.
2. If it's on my device, it's using my resources, my device to do this scanning that is of no value to me. They're basically stealing from me.

That all doesn't seem super coherent to me, but I'm interested in improving.

What do you think and why?
I think people need to realize that if they are going to use iCloud Photos, there is no expectation of privacy. Apple is responsible for what is put on their servers. Once a photo is uploaded, you don't really have any control over the media beyond being able to delete it. And even if one does that, there are usually other copies made during the redundant backup process.
 
So say the guys who are already negotiating user privacy with the Chinese government. Pandora's box has been opened; Cupertino is only concerned about the public relations fire they themselves unleashed.
 
You're gonna get jumped all over for this, here's the scoop.

Assuming you're in America, the rights you're referring to only protect you from government searches. Apple is not the government. Therefore, Apple is not "conducting a search" whether it's warranted or not. You have no presumption of innocence by Apple. You have no right to free speech on Apple's platform, etc.

True, however it becomes an issue if what Apple finds is used to prosecute.
 
Serious question: doesn’t everyone read the terms of service agreement when using or installing software? I bet you agree to all kinds of stuff that you normally wouldn’t, haha.
 
Turn off iCloud Photo Library
Use Google Photos

Your photos might be scanned in the cloud, but there is no on-device scanning.

All the other solutions people are proposing here will make you unhappy. This one might actually work for you.
Nobody is forcing you to update to iOS 15 either.
 
Serious question: doesn’t everyone read the terms of service agreement when using or installing software? I bet you agree to all kinds of stuff that you normally wouldn’t, haha.

I have. I even had a bud who is a lawyer read the parts I was confused about. He scratched his head at some of the stuff in there. There are parts that are still confusing as all get-out, or flat out don’t make sense.

I have not read it since iOS 12, so it has been a couple of years. I highly suspect it has not become any more transparent to the consumer.
 