Why would Apple open this floodgate? Is the government requesting this? Is some upcoming business venture related to it? Why do they offend all of their customers by searching everybody's phone for what only a few idiots collect? This is so out of proportion that I still can't believe they're doing it.
They might be announcing end-to-end encryption for iCloud Photos. They can't scan encrypted photos in the cloud, so it would have to be done BEFORE they're encrypted. Right now, they have the keys to everyone's photo libraries.
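Roughly, the ordering that argument implies looks like this. A minimal sketch in Python; the function names and the toy "encryption" are placeholders for illustration, not Apple's actual pipeline:

```python
import hashlib
import os

def perceptual_hash(photo_bytes):
    # Placeholder for a perceptual hash like NeuralHash; a real one is
    # robust to resizing/recompression, this plain SHA-256 is not.
    return hashlib.sha256(photo_bytes).hexdigest()

def encrypt(photo_bytes, key):
    # Toy stand-in for end-to-end encryption, only here to show the ordering.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(photo_bytes))

def upload(photo_bytes, known_hashes, key):
    # 1. The match check happens on the plaintext, on the device...
    matched = perceptual_hash(photo_bytes) in known_hashes
    # 2. ...because after this step the server only ever sees ciphertext.
    ciphertext = encrypt(photo_bytes, key)
    return {"ciphertext": ciphertext, "matched": matched}

payload = upload(b"raw photo bytes", known_hashes=set(), key=os.urandom(32))
print(payload["matched"])  # False: nothing in the (empty) database matched
```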
 
Yes… and you’ve finally reached the point.

I am happy to consent to iCloud scanning. I do not want to be put into a position where I must consent to searching on my personal device, or stop using my personal device.

There is no path forward to continue using my personal property unless I consent to warrantless searches.
Stay on iOS 14. iOS 15 is optional.
 
But they can do all of that now. What's stopping them?
They don't have the infrastructure/software to do local scanning; that's the whole point of the criticism. Not sure how many times people have tried to explain it.

That is the red line.

Anything people have shared onto a social platform, the owner of that network has the right, if not a moral obligation, to scan their network for 'bad things'* and report it. No one has argued against this.

The fact that no company has done local scanning (for bad things) before Apple's announcement should give you an indication of how much of a red line it has been. Facebook, for all of its creepy spying, didn't dare do it with any of their apps. Nor has Google, or Microsoft.

To everyone saying that Apple (and everyone else) has been spying on you all along and that it's not a big deal, I will ask: why did Apple even bother announcing this with a dog and pony show? It should have been a non-announcement. Why did they announce this separately from the rest of the iOS features during the betas and WWDC? Maybe because this is a big deal?

Apple totally dropped the ball on CSAM, given their paltry number of reports to the NCMEC and authorities compared to the other tech companies. To compensate for this, they implemented a system that totally crosses a barrier no other tech company has dared to cross. I guess two trillion dollars of net worth gives you a boldness no other company has.

Now that Apple has done this, and since there is a distinct lack of outrage from a segment of the public, this is a green light to all other providers to perform local scanning for 'bad things'.

No one here has given a good reason why Apple needed to implement their CSAM scanner in this manner. No other tech company needed to do it this way (local scanning). There are all sorts of ethical and legal barriers that needed open discussion before going this way, yet Apple decided unilaterally to open Pandora's box.

It's not even 'too late for me'; I'm not even in the USA. But once it is turned on, it will come to my country as it did to yours. I've been here (on a Mac) since 1985, probably longer than most everyone else, and I'm done. I'm not going to throw my stuff into a fire pit in outrage, but there will be no updates to this literal spyware, and as it cycles out, no more Apple products will be coming into my house.

*'Bad things' is not meant to make light of CSAM; it's just a broader term for any criminal photo as defined by each nation's laws.
 
You are so close to reaching the crux of the matter, but you slip at the end. Turning off iCloud Photo Library does not prevent the surveillance software from being installed on your personal device. You should know this if you read the press release, so you’re either ill-informed, or you’re being intentionally misleading.

So you're afraid the software could be run later, either intentionally or by a bug, since it's already installed?

But there is already software like that on the iPhone today:

* Software which can wipe your entire device (not on by default, but I think Apple can override it)
* Software which can delete, on Apple's command, every app you have installed (activated by default, no way to turn it off)
* Software which can stop any app from launching on Apple's command (activated by default, no way to turn it off)
* Software which can copy the entire device or at least all the user data and send it to Apple (not activated by default)

The last one is called iCloud backup.

Here is how Apple can easily misuse iCloud Backup way more than the CSAM Detection system.

1. Increase iCloud size quota for users (secretly)
2. Turn on iCloud backup on the device secretly without showing it in the user interface
3. iCloud Backup will, without any code changes, back up almost all of the user data to iCloud, secretly
4. Decrypt the backup in iCloud
5. Scan everything or give it to a government entity unencrypted

And yet, I haven't seen any protests against this type of software already being installed on iPhones for many years.

Wouldn't oppressive governments rather use this method which gives them everything?
 
@Pummers

Great points..

Unless I've missed something, even Google isn't doing local scanning like this on their own Pixel devices.

And that is GOOGLE!
(who doesn't miss a chance for data collection -- EVER)
 
So you're afraid the software could be run later, either intentionally or by a bug, since it's already installed?

But there is already software like that on the iPhone today:

* Software which can wipe your entire device (not on by default, but I think Apple can override it)
* Software which can delete, on Apple's command, every app you have installed (activated by default, no way to turn it off)
* Software which can stop any app from launching on Apple's command (activated by default, no way to turn it off)
* Software which can copy the entire device or at least all the user data and send it to Apple (not activated by default)

The last one is called iCloud backup.

Here is how Apple can easily misuse iCloud Backup way more than the CSAM Detection system.

1. Increase iCloud size quota for users (secretly)
2. Turn on iCloud backup on the device secretly without showing it in the user interface
3. iCloud Backup will, without any code changes, back up almost all of the user data to iCloud, secretly
4. Decrypt the backup in iCloud
5. Scan everything or give it to a government entity unencrypted

And yet, I haven't seen any protests against this type of software already being installed on iPhones for many years.

Wouldn't oppressive governments rather use this method which gives them everything?
None of that software was designed with the express purpose of snooping directly on my phone and calling the cops on me, though.
 
I would start by adapting the CSAM scanning to flag popular images that specific terrorists would share: memes about creating war in the West, executions, etc. Those can be hashed just as easily as any CSAM. Instead of sending it to the CSAM Apple team, it is vouchered the same way to an Apple anti-terror (iTerror™) squad, who would go to FISA court to get warrants to build a network of who received those messages and images once the specific account hits a set score.

Then all of those suspects' devices would receive hashes to search for more photos, and the neural-hashing algorithm would be expanded to target user-generated images instead of only the flagged hashes.

If it's common for terrorists to share the same exact pictures or their derivatives, and the same pictures are not shared in large numbers by the general populace, the CSAM detection system would work. But it would also require the US government to know in advance what kind of "iconic" pictures each terrorist group "collects".

Terrorists are now warned and should not share the same pictures.

But wouldn't it be more effective to use the new AI software in Messages or the photo recognition algorithms in Photos? They are much better suited to giving you hits on general categories such as "guns in desert", "woman in burka", "white Toyota pickup truck", "Muslim prayer". Combined with location data, it seems to be such a powerful tool.

To me, the CSAM detection tool seems so ineffective compared to other methods, so why use it, if you have the power to force Apple to do your bidding?
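For what it's worth, here is a toy illustration of why a hash-based system only catches known images and near-duplicates, never new ones. A tiny difference hash (dHash) stands in for NeuralHash, and the "images" are made-up grids of numbers:

```python
def dhash(pixels):
    # pixels: an 8x9 grid of grayscale values -> a 64-bit fingerprint built
    # from whether each pixel is brighter than its right-hand neighbour.
    return [1 if left > right else 0
            for row in pixels
            for left, right in zip(row, row[1:])]

def hamming(a, b):
    # Number of differing bits; a small distance means "same" image.
    return sum(x != y for x, y in zip(a, b))

known      = [[(r * 9 + c) % 17 for c in range(9)] for r in range(8)]   # "known" image
derivative = [[v + 1 for v in row] for row in known]                    # brightened copy
unrelated  = [[(r * 7 + c * 3) % 11 for c in range(9)] for r in range(8)]

known_hash = dhash(known)
print(hamming(dhash(derivative), known_hash))  # 0: the derivative still matches
print(hamming(dhash(unrelated), known_hash))   # much larger: an unrelated image is nowhere near a match
```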
 
None of that software was designed with the express purpose of snooping directly on my phone and calling the cops on me, though.

But they could be misused to do so, or Apple could be forced to do so.

Isn't one of the arguments that even if the software isn't harmful today, it could become so in the future, and therefore I don't want it on my device?

I gave you examples of software already installed on your device which can be considered harmful today, and also showed how one of them could easily be misused in the future.
 
If it's common for terrorists to share the same exact pictures or their derivatives, and the same pictures are not shared in large numbers by the general populace, the CSAM detection system would work. But it would also require the US government to know in advance what kind of "iconic" pictures each terrorist group "collects".

Terrorists are now warned and should not share the same pictures.

But wouldn't it be more effective to use the new AI software in Messages or the photo recognition algorithms in Photos? They are much better suited to giving you hits on general categories such as "guns in desert", "woman in burka", "white Toyota pickup truck", "Muslim prayer". Combined with location data, it seems to be such a powerful tool.

To me, the CSAM detection tool seems so ineffective compared to other methods, so why use it, if you have the power to force Apple to do your bidding?
Sure, they might be warned about this kind of activity, but it's not uncommon for people to violate good IT security practices. For all of their violence, terrorists likely react like most people too.

If everyone followed IT protocols, ransomware demanding BTC wouldn't be as common as it is, and email spam would be ineffectual.
 
But they could be misused to do so, or Apple could be forced to do so.

Isn't one of the arguments that even if the software isn't harmful today, it could become so in the future, and therefore I don't want it on my device?

I gave you examples of software already installed on your device which can be considered harmful today, and also showed how one of them could easily be misused in the future.
Again, the purpose of every one of the examples you cited was utility (i.e. usefulness) to the owner of the device.

If this CSAM system actually flagged bad photos for the USER before they got uploaded, and asked them if they wanted to purge said photos or send them for examination/review, that would be another thing entirely.
 
Provide a citation

From Apple’s own support page…and it has been doing this for about 5 years now (just better now obviously as tech has improved).

Moments: Search for an event, like a concert you attended or a trip you took. Photos uses the time and location of your photos along with online event listings to find matching photos.
People: Find photos in your library of a specific person or a group of people. Just keep names and faces organized in your People album.
Places: See your photos and videos on a map in the Places section. Or type a location name in the Search bar to see photos and videos from that place.
Categories: Photos recognizes scenes, objects, and types of locations. Search for a term like "lake" and select a result to see photos that match.
The Search tab also suggests moments, people, places, categories, and groups for you to search. Tap a suggested search, such as One Year Ago or Animals, to explore your photos.
When you search your photos, the face recognition, and scene and object detection are done completely on your device. Learn more about photos and your privacy.


The last part is basically what they are doing now as well. They have on-device parameters that recognize objects in your photos. No different than the on-device database hashes they will be adding.

The difference? The older data points just help when you are searching for the stuff listed above. The new one specifically matches photos against the database hashes and then flags them when they are uploaded to iCloud. Have enough of them in your library and Apple checks to verify. If you uploaded known child porn to your iCloud account, the authorities are contacted.
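To make that difference concrete, a rough sketch; the function names and the threshold value are made up for illustration, not Apple's:

```python
def classify_for_search(photo):
    # Older behaviour: on-device ML produces searchable labels ("lake",
    # "dog", "concert") that stay local and power the Search tab.
    return ["lake", "mountain"]  # illustrative output only

def csam_check(photo_hash, database_hashes, matches_so_far, threshold):
    # Newer behaviour: a membership test against a hash database shipped
    # to the device; matches are counted across uploads, and review is
    # only triggered once the count passes the threshold.
    matched = photo_hash in database_hashes
    total = matches_so_far + (1 if matched else 0)
    return matched, total, total >= threshold

_, total, review = csam_check("abc123", {"abc123", "def456"},
                              matches_so_far=29, threshold=30)
print(total, review)  # 30 True: only now would anything be escalated
```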


So, again, why wasn’t anyone freaking out about this existing (for 5 years now) “back door” way of scanning images on your phone? What would have stopped them previously from adding the ability to recognize “trump” or “pink triangles” or anything else they could think of? Why do all of these lame conspiracy theories have more validation now?

And still, why would the government even NEED this?? There are easier ways to scan your phone and especially iCloud by hacking into them instead of involving hard coded data being added to a phone by Apple (not the government…Apple does it!)



EDIT: When you click on the privacy link:

Photos lets you choose who has the full picture.

The Photos app uses machine learning to organize photos right on your device. So you don’t need to share them with Apple or anyone else.

More about Photos
Your photo and video albums are full of precious moments, friends, and your favorite things. Apple devices are designed to give you control over those memories.
Photos is also designed so that the face recognition and scene and object detection — which power features like For You, Memories, Sharing Suggestions, and the People album — happen on device instead of in the cloud. In fact, the A13 and A14 Bionic chips perform over 100 billion operations per photo to recognize faces and places without ever leaving your device. And when apps request access to your photos, you can share just the images you want — not your entire library.
 
@Pummers

Great points..

Unless I've missed something, even Google isn't doing local scanning like this on their own Pixel devices.

And that is GOOGLE!
(who doesn't miss a chance for data collection -- EVER)

Sigh…please read what I wrote (and copied and pasted right from Apple’s site) above.

Apple is BEHIND those companies when it comes to this stuff BECAUSE of their high standards when it comes to privacy.

You guys will complain about how crappy Siri is, or search, or Maps, while touting how great Google is with their data.

It is BECAUSE they have been scanning your phone for that info. Their lack of privacy rules has made their products so great when it comes to info like that.
 
You have to have iCloud Photo Library turned on and an unknown number of matches. Apple doesn't know at any point how many matches you have until you reach the threshold.

Then they can read the security vouchers, which were created when a match occurred and which include a derivative of the matched photo from your device. They will then only have access to the photos which were matched and flagged.

This will also allow them to do end-to-end encryption of iCloud at a later point.
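The "unknown number of matches until the threshold" part is what threshold secret sharing gives you. Here is a textbook Shamir sketch of that idea; Apple's actual construction layers this with private set intersection and blinded hashes, and the threshold of 30 is just an example, so treat this only as the gist:

```python
import random

P = 2**61 - 1  # a Mersenne prime; all arithmetic is in the field GF(P)

def make_shares(secret, threshold, count):
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, count + 1)]

def recover(shares):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

# One share of the account's voucher-decryption key rides along with each match.
account_key = random.randrange(P)
shares = make_shares(account_key, threshold=30, count=100)

print(recover(shares[:30]) == account_key)  # True: at the threshold the key is recoverable
print(recover(shares[:29]) == account_key)  # False: below it, the vouchers stay unreadable
```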

I have learned far too much about what Apple does and does not do. What I am wondering is whether Apple employees are doing a photo-to-photo comparison or something else.

We can only hope that Apple encrypts iCloud with you holding the key, though that may be very problematic for those who forget their "password".
 
I think you are perhaps underestimating the significance of this decision. It is a proof of concept to the whole world, and we will now see Apple under intense pressure to scan for who-knows-what on people's phones; governments may well order Apple in secret to load databases of all kinds onto phones.

Plus, you have gotten it wrong: Apple IS spying on me now (or will be if I download iOS 15).

Finally, why are they not doing this another way, when they can accomplish the same objective off people's phones? What are they up to? Are they being forced to do this? By whom?

That is my single biggest concern at this point: Apple's design appears to have taken off the table the reason they used during the San Bernardino attack to not comply with the FBI subpoenas.
 
Apple will do a human review which will catch almost all false positives. Apple will also not be turning the material over to law enforcement agencies but to NCMEC.

If Apple believes it's child pornography after this human review, they have to, by law, report it to NCMEC.

Providers like Apple have a high level of immunity in this process.

From what I have found out, NCMEC will turn it over to law enforcement authorities once confirmed.
 
If this CSAM system actually flagged bad photos for the USER before they got uploaded, and asked them if they wanted to purge said photos or send them for examination/review, that would be another thing entirely.

That's a really interesting point and distinction -- that is exactly how that should work if "user privacy on device" is really a guiding principle

The fact that it doesn't really drives home how much it's built to "catch people".
Just gross to design and install something like that device-side.
 
That's a really interesting point and distinction -- that is exactly how that should work if "user privacy on device" is really a guiding principle

The fact that it doesn't really drives home how much it's built to "catch people".
Just gross to design and install something like that device-side.

Client-side scanning has been on the EU's wish list for a while, and a proposed solution there.
 
That's the only possible explanation.

Maybe it's not ready yet.
Possible. What if it turns out not to be the case?

What is the explanation when we can scratch the only possible explanation?

Not trying to be snippy - serious question (I don't buy the E2EE narrative just yet, tbh) - because irrespective of tech and implementation disputes, what is still a massive mystery to me is the question: WHY?
 
Possible. What if it turns out not to be the case?

What is the explanation when we can scratch the only possible explanation?

Not trying to be snippy - serious question (I don't buy the E2EE narrative just yet, tbh) - because irrespective of tech and implementation disputes, what still puzzles me is: WHY?

I don't either until they actually do it (or at the very minimum announce the plan to get to that).

They need to totally shelve the CSAM stuff as discussed thus far until E2EE is ready and it can be a cohesive plan and announcement, etc

Just jamming in CSAM on its own immediately and justifiably makes people very worried
 