This intrusive way of searching phones for vile material will go wrong; this type of technology always goes wrong. In the UK where I live, photos of maybe a 7-year-old (not 10 as someone else said, yuk) at the beach, or of babies in the bath with mum and dad, could potentially flag the system one day if they look near enough to a known image, and a family will have their lives ruined by a mistake made by an automated creep going through your images. Face recognition technology isn't good enough yet, even though the London Met use it, and it's made mistakes, and Apple's system will make mistakes too.
https://www.wired.co.uk/article/face-recognition-police-uk-south-wales-met-notting-hill-carnival
As others have mentioned, the potential for certain countries like Saudi Arabia to use this to persecute, say, the LGBTQ+ community, or so-called 'dissidents' who stand up to the corrupt and brutal government in Belarus, is huge; this will be abused by them at some point, I am sure. Apple have opened a huge can of worms here. I've never used iCloud; I just don't trust anything that isn't on a secure thumb drive saved safely at home, or at an off-premises location as a backup. For the first time I won't update to the latest iOS 15, even though I don't use iCloud, because I don't agree that this is the best way to stop paedophiles swapping images. Soon it will be all Apple devices using this tech, and that's way too 1984-ish for me, let alone buying a new iPhone with it baked in.
Child abuse is horrendous; I went through physical, violent abuse as a child, but this is not the way to fight it. So much for "what happens on your iPhone stays on your iPhone." There are better ways to fight paedophile gangs, I'm sure, ways we don't know about that the police forces already use, rather than this very intrusive way of rifling through your images. All it takes is one image wrongly classified and this will be a train wreck. In countries where the age of consent is different (16 in the UK), or where beach pictures or bath time with the kids are seen in a different light than in, say, the USA or Canada, the whole house of cards, which is already wobbling, could come down and ruin Apple's reputation when, not if, this happens.
The fact governments are loving this idea should say enough. At last, a way onto your phone, as Apple fold and give governments around the world the control they want over individuals who like to be private for very meaningful reasons, in countries where their lives could be at risk. Also, the "if you've done nothing wrong" argument really doesn't apply here: being gay in Saudi Arabia isn't wrong, but it sure will be easier to find out now, and governments will get access, that's a given. It's not safe technology, and I for one, at this point in time, won't be loading iOS 15 or getting a new iPhone, because I don't trust technology like this not to f*ck up and cause immense harm to someone innocent, while the vile images are in all probability not swapped electronically anyway, because paedophiles and other criminals probably don't trust the web with their vile and disgusting habits. The EFF said it very well in the article below.
"Apple's filtering of iMessage and iCloud is not a slippery slope to backdoors that suppress speech and make our communications less secure. We’re already there: this is a fully-built system just waiting for external pressure to make the slightest change."
https://www.eff.org/deeplinks/2021/...t-encryption-opens-backdoor-your-private-life