For me it's the gateway to other invasions....
Probably started with Google: "we have AI scan your emails to better serve you ads." No human involvement, but I still don't want anyone doing that, nor do I trust that anyone, especially Google, isn't harvesting other data from my personal emails.
Now Apple wants to scan hashes of pictures that will be sent to iCloud because they don't want CSAM on their servers (nor does anyone). Next to no human involvement until they think you're a pedo; then someone looks through the flagged pictures to see if they are CSAM. There's a chance the picture wasn't CSAM, but now some stranger has already looked at YOUR PERSONAL PICTURES. Might have been a picture of a flower pot, might be a pic of a far more personal nature; no way to know until it's too late.
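(For anyone wondering what "scanning hashes" actually means in practice, here's a deliberately oversimplified Python sketch. This is NOT Apple's real NeuralHash/threshold system, just the basic idea: hash each photo queued for upload, check it against a database of known-bad hashes, and flag matches for human review. The database entry below is a placeholder.)

```python
# Deliberately oversimplified sketch of hash-based photo matching.
# NOT Apple's actual NeuralHash / private-set-intersection protocol;
# the hash value below is a placeholder, not real data.
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known illegal images (placeholder entry).
KNOWN_BAD_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def photo_hash(path: Path) -> str:
    # Exact-match hash of the file bytes. Real systems use perceptual hashes
    # that survive resizing/re-encoding, which is where false matches creep in.
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_for_review(upload_queue: list[Path]) -> list[Path]:
    # Photos whose hashes match the database get flagged; those are the ones
    # a human reviewer ends up looking at, whether the match was right or not.
    return [p for p in upload_queue if photo_hash(p) in KNOWN_BAD_HASHES]
```

The whole problem lives in that last step: a false match means a stranger is reviewing your photo.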
THEN...
Apple (or any other manufacturer) can decide at any moment that they want to compare hashes of pictures NOT being sent to cloud-based services, just pictures on your device. Sure, there will be mass outrage, but "think of the children".
THEN...
Once we are comparing picture hashes, then comes the question "what about video?" Pedos could be uploading CSAM videos to iCloud, so now we need to use "AI" to scan your videos too. Much harder to do, much more invasive, and much less accurate. In order not to falsely accuse anyone, humans will need to review YOUR PERSONAL VIDEOS!
THEN...
What about your security cameras from Ring, Amazon, Eufy, Logitech, etc.? All of these folks store video on their cloud systems, and surely they don't want illegal material on their servers either, so now they start using "AI" to listen to the audio from your cameras for certain sounds, words, phrases, etc. Just to be sure no one is falsely accused, humans will need to review YOUR PRIVATE AUDIO AND VIDEO!
THEN...
The government(s) get involved and want to get info on dissidents/opponents.
See what can happen? It always starts small, with something that is difficult to come out against, like CSAM, but as the technology gets better (like hashing audio or video), the invasions of your privacy get deeper and deeper.