We can’t trust CSAM
CSAM isn't something you trust - it's something you should not be in possession of.
Sounds like you really need a cabin in a remote woods area with no TV, internet, power, etc.

Aaaand I'm spending the rest of the day removing everything I have from iCloud. Not because I am doing anything wrong, but because any moderation system is prone to abuse and false positives, and I literally cannot afford to get all my **** shut down one day.
I actually want to entirely exist offline at this point. Back like 2010 again. Files on disk. Physical backups on disks off site.
Good thing that MD5 is so easy to hack and create collisions for.

MD5 hash sums are viewable by Apple as well.
Most of this stuff is not new; it's at least a decade old.
We are specifically referring to the system that involves identifying CSAM, and how such a system can be used to scan other “unhealthy, dangerous” stuff determined by the government, even if it is harmless based on the definition of CSAM. In short, this system opens the door to defining anything as dangerous, for no reason.
And the most extreme would be they don’t like “everything,” thus allowing them to prosecute “anyone” at will.

In reality this is so Project *** 2025 *** can take effect 😬 so they will scan for OTHER things THEY don’t like.
Still, the government can scan pictures they don’t like (which can be any), and use the same technology used in the CSAM scan to flag people who stored such images. The government can then prosecute those individuals. Whether it is efficient or not is another matter, but it can be used in that manner.

Aren't these scans based on hashes generated by NCMEC for the specific purpose of identifying CSAM? Moreover, they aren't a part of any government.
Moreover, this scanning doesn't work by image recognition or anything like that. It compares hashes, mathematical representations of the files, and only flags files whose hashes match those of known CSAM.
Edit to add: Don't take my word for it. Here's Apple's whitepaper on how their on-device scanning was going to work before they scrapped it.
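To make "it compares hashes" concrete, here's a minimal Python sketch of hash-set matching. The hash list and function names are made up for illustration, and this uses an exact cryptographic hash (SHA-256); Apple's proposed system actually used a perceptual hash called NeuralHash so that re-encoded copies of an image would still match, which plain cryptographic hashing cannot do.

```python
import hashlib

# Hypothetical database of known-bad hex digests for illustration only.
# (This entry is just the well-known SHA-256 of an empty byte string.)
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_hash(data: bytes) -> str:
    """Compute the SHA-256 hex digest of a file's bytes."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    # The scan never "looks at" image content; it only checks whether
    # the file's digest appears in the known-hash set.
    return file_hash(data) in KNOWN_HASHES

print(is_flagged(b""))       # True: digest is in the example set
print(is_flagged(b"photo"))  # False: unknown file, not flagged
```

The key point the sketch shows: a file that isn't already in the known-hash database can never be flagged, no matter what it depicts.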
how?
This post should be the tagline for Apple Intelligence.

This would appear to be CSAM scanning, imo.
I think it’s a type-and-forget here… so no problem if it’s lost 😜

If you want to exist entirely offline, why did you post this online?
Why is it so hard for people to just turn off “photos” and put your photos onto an external hard drive as backup?
Kind of seems like you are giving South Park too much credit.

How did they nail it? I couldn't find a decent image from the Human CentiPad episode. Everyone just clicked agree, and then Apple came and started collecting them for a human centipede, which they agreed to in the terms and conditions.
Butters is the only one they didn't come for, because he actually read it and clicked disagree.
How could someone actually disagree to the terms in real life? They'd have to immediately stop using all Apple products, without even a chance to get back in to get their data out without agreeing.
Both scenarios are completely ridiculous and should be illegal. Which is (was?) South Park's genius: demonstrating that fact with an over-the-top, technically plausible scenario.