CSAM scanning does not occur until you are about to post an image or save it into the cloud. Once the image is in the wild, the damage/crime is done.
CSAM scanning on device is unacceptable.
Apple collects billions from Google to allow Google to scan your iPhone for targeted advertisement.
I did not know Apple was selling our information to advertisers like Google and Facebook do.
How is scanning my phone for content that I don't have beneficial to me?
The CSAM thing is for our benefit. Anything in the cloud does not have privacy, and pictures on cloud services are being regularly scanned. Data that exists only on your phone is safe. Apple's CSAM implementation is basically warning you before posting/uploading something that might get you into trouble. IMO, this is a good thing.
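(For anyone wondering what that means mechanically: roughly, the image is hashed on the device and the hash is compared against a list of known-bad hashes right before upload. A minimal Swift sketch of that idea only; the real system uses a blinded perceptual hash (NeuralHash) and encrypted vouchers, and every name below is made up.)

import Foundation
import CryptoKit

// Invented stand-in for the database of known hashes that ships with the OS.
// Real entries would be perceptual-hash values, not SHA-256, and are blinded.
let knownBadHashes: Set<String> = []   // imagine this was loaded from the bundled database

// Hex digest of an image's bytes (illustrative only).
func hexDigest(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Called only on the upload path; photos that never leave the device never get here.
func wouldBeFlaggedBeforeUpload(_ imageData: Data) -> Bool {
    knownBadHashes.contains(hexDigest(of: imageData))
}

let photo = Data("example image bytes".utf8)
print(wouldBeFlaggedBeforeUpload(photo) ? "matched a known hash" : "no match, upload proceeds")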
In this case, I don't think Apple intentionally did anything wrong, but I do think the trial should move forward so the plaintiffs have an opportunity to prove their claims in court and Apple can present its defense.
Good. Now cue the usual tirade of Apple apologists explaining that Apple getting out of this lawsuit would actually be a good thing.
Apple collects billions from Google to allow Google to scan your iPhone for targeted advertisement.
The last report I saw was that Apple has collected $15 BILLION from Google, which means Google is making at least $30 billion in return.
How does scanning my phone for content that I don't have beneficial to me?
People like you would sell their soul if it benefitted Apple.
This is nothing new. When required by a government, they have always searched for whatever the government demanded.
The biggest problem I have is not that Apple is scanning for CSAM. It's that Apple decided the criteria for which they were willing to compromise our collective privacy. Now that they have proven they will do it once, there is no guarantee that they will not do it for another cause they feel strongly about, or are forced to do by an outside authority. The proverbial foot is in the door, and it won't be long before they are sitting at the table having a cup of tea.
This is nothing new. When required by a government, they have always searched for whatever the government demanded.
The difference, now, is that by using this system Apple can switch to end-to-end encryption for iCloud Photos, so that they cannot, in the future, search your iCloud photos.
The big difference is they did not search on my phone. They are not "iCloud photos" until they are actually on iCloud.
But they aren’t searching on your phone now, either, unless the photo is about to be uploaded to iCloud. If you turn off iCloud photo sync, then there is no scanning. Isn’t it better that any scanning happen on your device, where security researchers can see what’s happening, than in a cloud farm?
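(If it helps, the gate being described looks roughly like this; just a sketch, all names invented, not actual Apple code.)

import Foundation

// The claim above, as logic: nothing is checked unless iCloud Photos sync is on,
// and even then only at the moment a photo is about to be uploaded.
struct PhotoSettings {
    var iCloudPhotosEnabled: Bool
}

func upload(_ imageData: Data, settings: PhotoSettings, matchesKnownHash: (Data) -> Bool) {
    guard settings.iCloudPhotosEnabled else {
        return   // sync is off: the photo stays local and is never scanned
    }
    // The check runs here, on device, right before the upload, which is the
    // part security researchers can observe, unlike a server-side scan.
    let flagged = matchesKnownHash(imageData)
    // ... attach the (encrypted) result to the upload and send it ...
    _ = flagged
}

// Example: with sync disabled, the matcher is never even called.
upload(Data(), settings: PhotoSettings(iCloudPhotosEnabled: false)) { _ in true }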
Apple: We put a bomb on everyone's phone
Everyone: WHAT?
Apple: Don't worry, we only detonate it when you use a specific service and two agencies you've never heard of before determine you did this one thing we think is very bad.
Everyone: YOU. PUT. A. BOMB. ON. OUR. PHONES....
Apple: Don't worry. The bomb is totally safe. Here, look at this really sophisticated process we use to make sure we don't accidentally detonate it, or let someone who is not supposed to detonate it do so.
Everyone: Um.. and who decides the criteria for this process?
Apple: We do.
If you have to exaggerate ridiculously to make the thing you're complaining about look bad, you might not actually have a point. Everybody seems in a competition to make CSAM hash detection seem horrifying. But when someone says something edifying and actually reasonable sounding, you respond with... a bomb. Seriously?
So? Again, everyone can see exactly what they check for. They are doing it in the open. Worry about when they decide to search for something controversial. I don’t think searching for known child exploitation photos is controversial.
And actually I think you're refuting nonsense with falsehood. Everyone in fact *can't* see exactly what the CSAM detectors are searching for. As far as I know, Apple hasn't published either the list of hashes or the algorithm for hashing. And almost certainly they shouldn't, since that would enable the baddies to possibly obfuscate their CSAM pix. Seeing a list of hashes wouldn't count as "everyone can see exactly what they check for" either. For damned sure you can't see the pictures those hashes pertain to. Not that I'd want to see something that would likely give me nightmares, and if anybody did want to see them, they'd be illegal to view by their nature. OTOH, it would be nice to be sure that the hashes are of 6-year-olds who couldn't be mistaken for anything older, and not of 17-year-olds who could hardly be thought under 20.
Not at all falsehood. What Craig F. said in an interview:
Because Apple distributes the same version of each of its operating systems globally and the encrypted CSAM hash database is bundled rather than being downloaded or updated over the Internet, Apple claims that security researchers will be able to inspect every release.
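(Concretely, the kind of check a researcher could do is pull the database file out of two regional OS images and compare digests. A sketch in Swift with invented file paths; the real database is an encrypted blob inside the OS, not a readable list of hashes.)

import Foundation
import CryptoKit

// If the hash database is bundled in the OS image, anyone can extract it from
// two regional builds and confirm the bytes are identical. Paths are invented.
func sha256OfFile(at path: String) throws -> String {
    let data = try Data(contentsOf: URL(fileURLWithPath: path))
    return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

do {
    let us = try sha256OfFile(at: "ios_build_us/csam_hash_db.bin")
    let ru = try sha256OfFile(at: "ios_build_ru/csam_hash_db.bin")
    print(us == ru ? "same database in both builds" : "databases differ")
} catch {
    print("couldn't read one of the extracted files:", error)
}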
That literally means nothing. The hashes can be all-encompassing, with a global set of data for every government that wants to use it, and iOS has a history of functioning differently in different regions: e.g. FaceTime used to not be enabled in Saudi Arabia, and the first-run experience in Russia lets apps from local developers be picked from a mini App Store list during setup, despite being the same global iOS binary.
That’s not what “literally” means. The interview says it’s the same hash database EVERYWHERE, as it is distributed as part of the OS image, and this can be confirmed by security researchers. Researchers have actually already found old versions of the hash database and algorithm on the phones. If Saudi Arabia or Russia gets a different hash database, it will be readily apparent to them.
This conversation is weak. You should know that anybody can see what is in the OS firmware, even WITHOUT an iPhone, because you can literally download it off Apple’s website and analyze it.
Unless you know what is in the hash database, it LITERALLY means nothing. You can hash multiple sets of data (one of them being CSAM) into the same file to search for, have the same code do multiple things at the same time, or work with subsets of the hashes in the file depending on region. The transparency argument is totally bunk.
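(To make that objection concrete: a file that is byte-identical everywhere could still contain entries from several source lists, with the matching code consulting only the subset enabled for a given region. This is entirely hypothetical structure, not Apple's stated design; it just shows the shape of the worry that identical bytes alone don't prove identical behaviour.)

import Foundation

// Hypothetical: one global database, entries tagged by source list,
// and a region-dependent choice of which sources are "active".
struct HashEntry {
    let digest: String
    let sources: Set<String>   // e.g. ["NCMEC"] or some other agency's list
}

func activeHashes(in database: [HashEntry], enabledSources: Set<String>) -> Set<String> {
    Set(database.filter { !$0.sources.isDisjoint(with: enabledSources) }
                .map { $0.digest })
}

// Same bytes shipped everywhere...
let database = [
    HashEntry(digest: "aaaa...", sources: ["NCMEC"]),
    HashEntry(digest: "bbbb...", sources: ["SomeOtherList"]),
]

// ...but, in principle, a different subset could be consulted per region.
print(activeHashes(in: database, enabledSources: ["NCMEC"]).count)                   // 1
print(activeHashes(in: database, enabledSources: ["NCMEC", "SomeOtherList"]).count)  // 2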
A lot of ignorant people support Apple; it’s sad and a waste of time trying to convince them a surveillance algorithm is not good.
Apple is going to have it rough over the next few years now that all the big boys are out to get them.
Weird how suddenly everybody stopped talking about CSAM though. Seems people got tired of complaining about it lol