I think they ARE respecting that dynamic. Again, every single user has the choice to disable iCloud Photos. Isn't that control?
I guess it is - maybe I just don't like that I don't get a say in whether this gets implemented. At that point, though, it comes down to what level of control each person subjectively finds acceptable.
Oh, it's not a "few" people attempting to upload CSAM. It's an epidemic and has been for a long time. Believe me, Apple (and other cloud services) would not be spending all this time and all these resources on CSAM detection if it were some small problem.
You're 1,000% correct, and I definitely didn't mean to minimize that - I was just trying to illustrate a point.
Yes, it will be on their servers, but in effect "quarantined" for review and not able to be distributed. As for the "exact same amount of time" - I don't know. Was the server-side scan happening in real time as photos were uploaded, or just periodically? In any case, the point of this change isn't to reduce server time or anything, but rather to continue doing what they've always done in a more private manner. That's it! I don't think Apple ever claimed the method is more effective when done on the device itself - just more private.
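Just to make the "same matching, different location" idea concrete, here's a rough, hypothetical sketch of checking a photo against a known-hash list on the device before upload. To be clear: Apple's actual system reportedly uses NeuralHash perceptual hashes plus private set intersection and safety vouchers, not a plain SHA-256 lookup like this, and the loader function here is made up. This only illustrates where the check happens, not how theirs really works.

```swift
import CryptoKit
import Foundation

// Hypothetical loader for the known-hash database shipped to the device.
// A real system would ship an opaque, signed blob; this placeholder just
// lets the sketch compile.
func loadKnownHashDatabase() -> Set<String> {
    []
}

let knownHashes = loadKnownHashDatabase()

// Hex fingerprint of the raw photo bytes (a stand-in for a perceptual hash).
func fingerprint(of photo: Data) -> String {
    SHA256.hash(data: photo)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Runs on-device, before upload: the server only learns anything about a
// photo if it matches, instead of scanning every uploaded image itself.
func shouldFlagBeforeUpload(_ photo: Data) -> Bool {
    knownHashes.contains(fingerprint(of: photo))
}
```

The point of the sketch is just that the comparison runs client-side, so the server only learns something about a photo when there's a match, rather than scanning every image itself. Whether that relocation actually counts as "more private" is exactly the judgment call we're disagreeing on.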
I'm trying to shift my mindset on this to make it more acceptable, b/c I prefer my iPhone over others and want to do anything possible to help stop CSAM, CP, pedophilia, etc. (though I still wonder how much this is truly helping with that).
I'm just having a hard time shaking the fact that, again, it's being done on my device without my say. And I believe there are trade-offs in how Apple is implementing this that outweigh what I view as a small privacy benefit.