Possible. What if it turns out not to be the case?

What is the explanation when we can scratch the only possible explanation?

Not trying to be snippy - serious question (I don't buy the E2EE narrative just yet, tbh) - because irrespective of tech and implementation disputes, what is still a massive mystery to me is the question: WHY?

That is the one question I have not found any kind of an answer to during my search into this.
Why client side?
 

It needs to be discussed and worked through transparently before implementation.

Really, even if people are OK with this on-device scanning (which I'm really still "not"), they should all be for this being fleshed out over a longer time period, with more discussion, debate, and transparency.

A corporation, particularly in Apple's position, should not be jamming something like this in.

It feels downright devious to only now talk about it, merely a month or so before iOS 15 and Monterey ship.

Notice this was TOTALLY absent from the discussion at WWDC?
They abso-frickin'-lutely knew they were going to be doing it then, too.

This is wrong.
This is rushed.

Nobody should trust the way they've conducted the release of this information - the timing, and consequently even the intent.
 
I’m convinced that you and others just choose not to read the facts about this tech and that it has in reality been on your phone in one form or another for 5 years now.
 

I would suspect I am likely one of the most well-read on this topic here over the last few days. I can appreciate what Apple has built and what it looks like they are trying to do. What I have not found in all the pages, audio, and video is WHY Apple chose to do this client side. This design is not a new solution. I think a rational explanation from Apple on this would be a benefit for all.

I am in total agreement with @turbineseaplane and others on this; there should be some additional discussion before this is rolled out.
 
The question I have is, if I already don't use iCloud Photos, and I don't have any kids, then this theoretically shouldn't affect me, right?

I certainly do not have any CSAM materials, but I am still concerned about surveillance.

Yes I know, "If they can surveil this today, then tomorrow blah blah blah." Yes, I know, "Frog in a pot blah blah blah." But let's just talk about today for the moment. It doesn't affect me, right?
 

Directly affect you today? Not in anything I have found. Software bugs tbd.
 
And maybe the "why" is much simpler than anyone wants to admit.

* The tech is already on your phone to do this and they have been doing this for 5 years.
* It is more private being done on your device rather than scanning every single photo you might upload to iCloud.
* You can turn off iCloud Photos and it never scans anything on your device...FYI, Apple will scan your photos on device for spotlight search and other features as I mentioned above.
* It PREVENTS people from creating code or using the code for nefarious reasons...Apple hard codes the image database info into iOS where no one has access except for them. And because they have been doing this for 5 years, they know the process is secure.

There may be a thread on here from a couple of years ago that I missed, but where is the thread stating, "They're going to scan my photos to tell me if there is a cat or a dog in a pic??? What if someone uses that tech and uploads a setting to scan for pics of my private parts?? AAAAAAAARRRRRGGGGHHH!!!"
 
1. Any narrative that begins with "the children," has nothing to do with CP and will have effectively 0 impact on the problem they're claiming to address. It's just an emotional button to push because pretty much nobody supports abusing children.
2. China has Apple's balls in a vice and they already caved in years ago. They're just rolling out a public service announcement to the governments of planet Earth that they're open for business. Replace CSAM database, with anything you want.
3. This makes Apple pretty much no different than every other tech company, except for the fact that Apple is in a unique position because only 2 months ago they were still the "we care about YOUR privacy" company. https://forums.macrumors.com/thread...ge-behind-csam-scanning.2307045/post-30161579
They used to: https://www.apple.com/customer-letter/
But as of last year, they caved in: https://www.reuters.com/article/us-...ps-after-fbi-complained-sources-idUSKBN1ZK1CT
Welcome to 1984 comrade, please do not engage in wrongthink, we have always been at war with Oceania. Your phone, your smart home, and your connected car, are all ensuring your continued compliance with what's best for the CIA, I meant to say: United States. We appreciate your cooperation.
 
Doesn’t matter if you have kids or not. The system only looks for images that are already known to be CSAM. It won’t find new images.
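To make the "known images only" point concrete, here's a minimal Python sketch of matching against a fixed database of hashes. The hash function and database contents here are stand-ins for illustration; the real system uses NeuralHash, a perceptual hash that tolerates resizing and recompression, not an exact cryptographic hash like this one.

```python
import hashlib

# Hypothetical database of hashes of already-known images
# (a stand-in for the real hash database; these are example bytes).
known_hashes = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True only if this image's hash is already in the known set."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(matches_known_database(b"my-new-cat-photo"))  # False - not in the database
print(matches_known_database(b"known-image-1"))     # True - already known
```

The property being claimed is exactly this: a photo whose hash isn't already in the database can never produce a match, no matter what it depicts.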
 
I meant that I won’t be subjected to the new iMessage features targeted at minors.
 

Sorry, but I feel you are a bit off kilter on this and keep drifting from the point. Maybe you missed it.
Client-side searching has been around for a long time. I seriously doubt Apple is doing this today; they are reporting far too few CSAM violations for that. All they apparently do today is some Mail scans and other checks when issued a subpoena. The function that takes a face, a leaf, or a flower and does a "Google" search is far different from this. I suspect you are quite well aware of this.

It isn't the solution itself; rather, why do it client side? Despite the potential for misuse, there are easier methods to accomplish this. After all I have reviewed and learned, that single piece stands out. It stands out to a lot of others also.
 
It's because Apple is putting this on the device - scan all you want in iCloud. I have seen many pro-Apple supporters say "turn off iCloud, nothing to worry about." Then what the he!! is the point? Impose on the innocent? When this gets to the courts, it will be found to be an illegal search.

iPhone 13 will be out this fall - hurry up and get your new iSpyware. /s
 
So when Apple said explicitly last week that they will not expand client-side scanning to more categories, do you believe they are lying?
 
Yes. See also: marketing. It’s just manipulating people’s emotions to make a particularly noxious turn of events a more palatable narrative. Please get back to me as announced scope and feature creep grows.
 

I agree here - why do it client side? If Apple doesn't trust its user base, why should we trust Apple to not abuse the client side NeuralHash engine in the future?
 
Because if Apple does it server side, they will scan the entire iCloud library of every iCloud user, as opposed to the per-upload check done on the device. On the device, the scan runs against a database baked into your OS. On the server, Apple would run the scan against a rolling database on Apple's servers.
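A toy contrast of the two architectures being debated. All names and behavior here are hypothetical sketches, not Apple's actual implementation:

```python
# Sketch of the two designs. Everything here is illustrative only.

ON_DEVICE_DB = frozenset({"hash-of-known-image"})  # shipped inside the OS, fixed per release

def client_side_upload(photo_hash: str, icloud_photos_enabled: bool):
    """On-device design: the check runs only in the iCloud upload path."""
    if not icloud_photos_enabled:
        return ("not scanned", "not uploaded")
    matched = photo_hash in ON_DEVICE_DB
    return ("voucher attached" if matched else "no match", "uploaded")

def server_side_scan(stored_library_hashes, server_db):
    """Server-side design: the provider can sweep the entire stored library
    against a database it can change at any time, without an OS update."""
    return [h for h in stored_library_hashes if h in server_db]
```

The trade-off being argued over: the on-device database is the same for every user and is only consulted at upload time, while a server-side database is invisible to users and can be swapped silently.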
 

Apple could do something like a midpoint server that sits between your device's upload and iCloud, scanning with the same rules. I am sure there are other options. I want to find a paper from the EU I ran across that spelled out a number of different solutions for this; client-side scanning was just one.

There are other options that keep this off the device.
 
I’ve been contemplating just switching off iCloud photos and storing everything on the device, but the last thing I want is to accidentally lose everything. I hope Apple changes their mind.
Back up to a NAS, and also put your photos in a Cryptomator folder in iCloud. Both prevent Apple from getting access, and you don't lose your photos.
 
Sigh…please read what I wrote (and copied and pasted right from Apple’s site) above.

Apple is BEHIND those companies when it comes to this stuff BECAUSE of their high standards when it comes to privacy.

You guys will complain about how crappy Siri is or searching on other browsers or Maps while touting how great Google is with their data.

It is BECAUSE they have been scanning your phone for that info. Their lack of privacy rules has made their products so great when it comes to info like that.

I’m convinced that you and others just choose not to read the facts about this tech and that it has in reality been on your phone in one form or another for 5 years now.
I don't understand how you can believe both statements to be accurate.

The tech has been on our phones for years to profile everyone and yet it's still so crappy?
Somehow we had privacy and now we don't because we never did?

I accepted that Maps and Siri were half brain-dead because of the privacy issues, but now I will not accept any of it because Apple has decided to invade my privacy in a serious breach.

So now you want me to accept that CSAM detection won't be half brain-dead, because "let's trust Apple"?
Like I said before, when you have a quasi-governmental agency reporting you to the Feds, backed by a two-trillion-dollar company's lawyers, do you think that your single $200/hr lawyer is going to be an effective shield? You want the government to do its job, but you need them to be accurate, right, and fair. It's hard to be sure that they will be, given a serious invasion of personal privacy delivered on a tight timeline without any oversight, transparency, or review.

Technically speaking, if Apple wanted this great tech to work for iCloud only, why didn't they force EVERY photo shared to iCloud to be submitted with a voucher, and then use their own servers to scan for a CSAM match - and if there is no match, delete the bloody voucher and move on? That way users keep their privacy on their own phones, and no CSAM database has to cross into the user's operating system.
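As a sketch, that proposed server-side alternative might look like the following. This is entirely hypothetical - it illustrates the commenter's suggestion, not Apple's published design, and every name is made up:

```python
def receive_upload(photo_id: str, voucher: dict, csam_hashes: set):
    """Hypothetical midpoint-server flow: every upload carries a voucher,
    the server checks it, and non-matching vouchers are deleted at once."""
    if voucher["image_hash"] in csam_hashes:
        return {"photo": photo_id, "voucher": voucher}  # held for human review
    return {"photo": photo_id, "voucher": None}         # voucher discarded

stored = receive_upload("IMG_0001", {"image_hash": "abc"}, csam_hashes={"zzz"})
print(stored["voucher"])  # None - no match, so nothing extra is kept
```

In this design the database never leaves the server, at the cost of the server briefly seeing a voucher for every upload.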
 
You have to have iCloud Photo Library turned on and an unknown number of matches. Apple doesn't at any time know how many matches you have until you reach the threshold.

Then they can read the security vouchers, which were created when a match occurred and include a derivative of the photo from your device. They will then only have access to the photos which were matched and flagged.

This will also allow them to do end-to-end encryption of iCloud at a later point.
iCloud truly being end to end encrypted is wishful thinking. Apple won't do that because part of their company policy is working with the powers that be for the greater good etc.

Someone on the internet posited that idea and I think many here are trying to convince themselves that that is Apple's real goal. It's not going to happen. If Apple really wanted to do that, they could have done it already. The tech is there. They stopped with the idea in 2018.
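For what it's worth, the threshold mechanism described above is typically built on threshold secret sharing: each match releases one share of a decryption key, and any number of shares below the threshold reveals nothing. A self-contained Shamir-style sketch (illustrative only; Apple's actual scheme layers more cryptography on top of this idea):

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime; field for the share arithmetic

def make_shares(secret: int, threshold: int, n: int):
    """Split `secret` into n shares; any `threshold` of them reconstruct it,
    and fewer than `threshold` reveal nothing about it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

# Each "match" would release one share of the account's review key.
secret_key = 123456789
shares = make_shares(secret_key, threshold=3, n=5)
assert reconstruct(shares[:3]) == secret_key   # 3 matches: key recovered
assert reconstruct(shares[:2]) != secret_key   # 2 matches: nothing learned
```

This is why the quoted post can claim Apple learns nothing until the threshold is crossed: below it, the vouchers are mathematically opaque, not merely policy-protected.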
 
The thing is, Apple can't allow CSAM images on their servers, so either they have the encryption key for all of your photos, or they do it the new way, where they have access to NONE of your photos UNTIL there are MULTIPLE matches... only THEN do they actually have access to those specific photos to visually check, and then they decide whether to get anyone else involved.

Honestly, I'd rather have it the new way. One in a trillion per year are pretty good odds that I'll have absolutely nothing to worry about and Apple will never have access to any of my content.
They could simply run uploads through a proxy server and check that way. They would be doing it off the device, and they, not us, would pay for the power usage - and we users wouldn't be treated as convicted of a crime without a trial.
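On the "one in a trillion" figure: under a simple binomial model you can see how a per-image false-positive rate combined with a match threshold drives the overall odds down. The numbers below are assumptions for illustration, not Apple's published parameters:

```python
from math import comb

def prob_at_least_t(p: float, n: int, t: int, terms: int = 50) -> float:
    """P(at least t false matches among n photos), binomial model.
    Tail terms shrink extremely fast here, so summing a few dozen suffices."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t, t + terms))

p = 1e-6    # assumed per-image false-positive rate
n = 10_000  # photos uploaded in a year
t = 30      # threshold of matches before anything is revealed
print(prob_at_least_t(p, n, t))  # astronomically small
```

Whether a number like 1e-6 is the right per-image rate is exactly the kind of parameter critics want independently verified rather than taken on trust.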
 
Very misleading. Google doesn't "scan" your phone; they gather data from location, search, and so on, and you can turn much of it off. Plus, they don't hide what kind of company they are: everyone who uses a Google/Android phone knows that Google makes money from your data, and it's up to you to guard what you give out and how much.

Apple has not asked, nor have they offered any way to turn this spyware off. They are arrogantly assuming that, because of their reputation, they can get away with putting spyware on people's phones.
 
Right - if Apple were much less "privacy focused," this would seem much less insidious. If Google announced this, I suspect people would grumble, but the reaction would be completely different.
 