That said, some do fear that once Apple has opened the door to this kind of on-device scanning and reporting, it could very well choose to build algorithms that scan for a lot more, but again that's getting into extreme "What if?" scenarios. If you're willing to go down that road, you should have stopped using an iPhone years ago, as Apple "could" do just about anything it wants behind your back.
The fear is that Apple could be strong-armed into scanning for other stuff. They have no control over the hashes they receive from their sources, and very little means of verifying them; the only check they get is that once an account is flagged, they can review the matched images. That means our only protection from government abuse of the system is the people at Apple doing manual review. Better hope they care about their work like Craig does, perhaps more so.
Apple’s method of detecting the material on-device is precisely what makes this dangerous. Doing a server-side-only check on shared files would be much safer for both us and Apple. Their AI could scan all shared photos (not the library just being stored, but photos that are explicitly “Shared”) and look to see whether any photo contains both children and nudity. If one does, hash it and compare against the database, and if it matches, flag it for human review. This adds a layer of protection: Apple’s image recognition AI has to detect possible CSAM first, then positive hits have to match the database, and only then does a human at Apple look at the report. If Apple’s AI doesn’t detect the presence of both nudity and children in the same photo, it goes no further than that (this kind of image recognition already runs on our phones today). This extra step should help prevent government abuse, since internet memes and political crap never have a chance of being checked against the database. It also protects us in that everything happens on-server, which removes the potential for the system to operate without iCloud.
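If that pipeline were built, the control flow might look roughly like the sketch below. This is a minimal illustration in Swift; `detectsChildren`, `detectsNudity`, and `perceptualHash` are hypothetical stand-ins for Apple’s image-recognition models and a NeuralHash-style perceptual hash, not real APIs:

```swift
import Foundation

// Hypothetical stubs standing in for Apple's server-side image-recognition
// models and a perceptual hash function; none of these are real Apple APIs.
func detectsChildren(_ data: Data) -> Bool { false /* ML classifier stub */ }
func detectsNudity(_ data: Data) -> Bool { false /* ML classifier stub */ }
func perceptualHash(_ data: Data) -> String { "" /* NeuralHash-style stub */ }

enum ScanResult {
    case clean                 // AI found no children+nudity combination
    case noMatch               // AI flagged it, but no database match
    case flaggedForHumanReview // AI hit AND database hit
}

// Server-side check, run only on an explicitly shared photo.
func scanSharedPhoto(_ imageData: Data, csamHashes: Set<String>) -> ScanResult {
    // Step 1: the AI must detect BOTH children and nudity in the same
    // photo, or the pipeline stops here. Memes and political images the
    // AI never flags can't be compared against arbitrary hashes at all.
    guard detectsChildren(imageData) && detectsNudity(imageData) else {
        return .clean
    }
    // Step 2: only AI-flagged photos are hashed and checked against the
    // known-CSAM database.
    guard csamHashes.contains(perceptualHash(imageData)) else {
        return .noMatch
    }
    // Step 3: a positive match still goes to a human reviewer at Apple
    // before anything is reported.
    return .flaggedForHumanReview
}
```

Note how each stage is a strictly narrowing filter: the hash database alone can never pull in a photo the AI didn't flag first.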
Not an ideal setup (I'd rather do without any of this and keep photos as-is), but if we must have it, it has to be (see the sketch after this list):
1. All on-server
2. Also rely on Apple’s image recognition AI (the same tech that already runs on our phones to enable image search, but running on-server instead, and ONLY on shared photos, not the main library).
3. Be bundled with E2E encryption on ALL stored iCloud data (sharing photos with other people necessarily breaks E2EE on those photos only).
4. Mac/PC cable/Wi-Fi sync and optional iCloud features remain. We must retain the option of using iTunes or Photos to sync pictures/videos instead of iCloud, and we must retain the ability to leave iCloud turned off.
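Taken together, the gating policy in that list reduces to a couple of boolean checks. Again a rough Swift sketch with made-up types, not Apple’s actual implementation:

```swift
import Foundation

// Hypothetical types reflecting the four requirements above.
struct UserSettings {
    let iCloudPhotosEnabled: Bool  // users can leave iCloud off entirely (4)
}

struct Photo {
    let isExplicitlyShared: Bool   // sharing breaks E2EE on this photo only (3)
}

// A photo is eligible for the on-server check only if iCloud is in use
// AND the user explicitly shared it; the E2E-encrypted main library and
// anything synced locally via iTunes/Photos is never scanned (1, 2, 4).
func isEligibleForServerScan(_ photo: Photo, settings: UserSettings) -> Bool {
    guard settings.iCloudPhotosEnabled else { return false }
    return photo.isExplicitlyShared
}
```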