I did not know Apple was selling our information to advertisers the way Google and Facebook do.
Apple collects billions from Google to allow Google to scan your iPhone for targeted advertising.

The last report I saw was that Apple has collected $15 BILLION from Google, which means Google is making at least $30 billion in return.
 
The CSAM thing is for our benefit. Nothing in the cloud is private, and pictures on cloud services are already scanned regularly. Data that exists only on your phone is safe. Apple's CSAM implementation basically warns you before you post/upload something that might get you into trouble. IMO, this is a good thing.
How is scanning my phone for content that I don't have beneficial to me?

People like you would sell their soul if it benefitted Apple.
 
Ha, the drug deals:
“So, while I was doing my drug dealing activities, Siri recorded me… I want $5000 for that”
 
Good. Now cue the usual tirade of Apple apologists explaining that Apple getting out of this lawsuit would actually be a good thing.
In this case, I don't think Apple intentionally did anything wrong, but I do think the trial should move forward so the plaintiffs have an opportunity to prove their claims in court and Apple can present its defense.
 
Apple collects billions from Google to allow Google to scan your iPhone for targeted advertising.

The last report I saw was that Apple has collected $15 BILLION from Google, which means Google is making at least $30 billion in return.

Google doesn’t “scan your iPhone,” not even for targeted advertising.
 
How is scanning my phone for content that I don't have beneficial to me?

People like you would sell their soul if it benefitted Apple.

The biggest problem I have is not that Apple is scanning for CSAM. It's that Apple decided the criteria under which they were willing to compromise our collective privacy. Now that they have proven they will do it once, there is no guarantee that they will not do it again for another cause they feel strongly about, or because an outside authority forces them to. The proverbial foot is in the door, and it won't be long before they are sitting at the table having a cup of tea.
 
The biggest problem I have is not that Apple is scanning for CSAM. It's that Apple decided the criteria under which they were willing to compromise our collective privacy. Now that they have proven they will do it once, there is no guarantee that they will not do it again for another cause they feel strongly about, or because an outside authority forces them to. The proverbial foot is in the door, and it won't be long before they are sitting at the table having a cup of tea.
This is nothing new. When required by a government, they have always searched for whatever the government demanded.

The difference now is that by using this system, Apple can switch to end-to-end encryption for iCloud Photos, so that they cannot, in the future, search your iCloud photos.
 
This is nothing new. When required by a government, they have always searched for whatever the government demanded.

The difference now is that by using this system, Apple can switch to end-to-end encryption for iCloud Photos, so that they cannot, in the future, search your iCloud photos.

The big difference is they did not search on my phone. They are not "iCloud photos" until they are actually on iCloud.
 
The big difference is they did not search on my phone. They are not "iCloud photos" until they are actually on iCloud.

But they aren’t searching on your phone now, either, unless the photo is about to be uploaded to iCloud. If you turn off iCloud Photos sync, then there is no scanning. Isn’t it better that any scanning happen on your device, where security researchers can see what’s happening, than in a cloud farm? To make the gating concrete, here's a toy sketch; it is not Apple's actual code, the names are made up, and an ordinary SHA-256 stands in for the real perceptual hash.
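```python
# Toy sketch of the gating described above; purely illustrative.
# KNOWN_HASHES stands in for the bundled database, and SHA-256 stands
# in for the (non-public) perceptual hash.
import hashlib

KNOWN_HASHES: set[str] = set()  # hypothetical placeholder for the bundled database

def check_before_upload(photo_bytes: bytes,
                        icloud_photos_enabled: bool,
                        queued_for_upload: bool) -> bool:
    """Return True only if a scan actually ran and the photo matched a known hash."""
    if not (icloud_photos_enabled and queued_for_upload):
        return False  # iCloud Photos off or photo not being uploaded: no scan at all
    digest = hashlib.sha256(photo_bytes).hexdigest()
    return digest in KNOWN_HASHES
```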
 
But they aren’t searching on your phone now, either, unless the photo is about to be uploaded to iCloud. If you turn off iCloud Photos sync, then there is no scanning. Isn’t it better that any scanning happen on your device, where security researchers can see what’s happening, than in a cloud farm?

Apple: We put a bomb on everyone's phone.

Everyone: WHAT?

Apple: Don't worry. We only detonate it when you use a specific service and two agencies you've never heard of before determine you did this one thing we think is very bad.

Everyone: YOU. PUT. A. BOMB. ON. OUR. PHONES....

Apple: Don't worry. The bomb is totally safe. Here, look at this really sophisticated process we use to make sure we don't accidentally detonate it or let someone detonate it who isn't supposed to.

Everyone: Um... and who decides the criteria for this process?

Apple: We do.
 
"One user in the lawsuit claimed that his private discussions with his doctor about a "brand name surgical treatment" caused him to receive targeted ads for that treatment..."

You know, because for damned sure he didn't Google it afterward, and Google wouldn't have sent targeted ads anyhow. No sir. The user talked to his doctor about this thing and didn't ever mention it anywhere, ever.

Also, can I please sue Google over all my targeted ads? Because I've sure had a lot of those.
 
Apple: We put a bomb on everyone's phone.

Everyone: WHAT?

Apple: Don't worry. We only detonate it when you use a specific service and two agencies you've never heard of before determine you did this one thing we think is very bad.

Everyone: YOU. PUT. A. BOMB. ON. OUR. PHONES....

Apple: Don't worry. The bomb is totally safe. Here, look at this really sophisticated process we use to make sure we don't accidentally detonate it or let someone detonate it who isn't supposed to.

Everyone: Um... and who decides the criteria for this process?

Apple: We do.

So? Again, everyone can see exactly what they check for. They are doing it in the open. Worry about when they decide to search for something controversial. I don’t think searching for known child exploitation photos is controversial.
 
Apple: We put a bomb on everyone's phone

Everyone: WHAT?

Apple: Don't worry we only detonate it when you use a specific service and two agencies you never heard of before determine you did this one thing we think is very bad.

Everyone: YOU. PUT. A. BOMB. ON. OUR. PHONES....

Apple: Don't worry. The bomb is totally safe. Here, look at this really sophisticated process we use to make sure we don't accidentally detonate it or let someone else detonate it that is not supposed to.

Everyone: Um.. and who decides the criteria for this process?

Apple: We do.
If you have to exaggerate ridiculously to make the thing you're complaining about look bad, you might not actually have a point. Everybody seems to be in a competition to make CSAM hash detection seem horrifying. But when someone says something edifying and actually reasonable-sounding, you respond with... a bomb. Seriously?
 
So? Again, everyone can see exactly what they check for. They are doing it in the open. Worry about when they decide to search for something controversial. I don’t think searching for known child exploitation photos is controversial.
And actually I think you're refuting nonsense with falsehood. Everyone in fact *can't* see exactly what the CSAM detectors are searching for. As far as I know, Apple hasn't published either the list of hashes or the algorithm for hashing. And almost certainly they shouldn't, since that would let the baddies obfuscate their CSAM pix. Seeing a list of hashes wouldn't count as "everyone can see exactly what they check for" either. For damned sure you can't see the pictures those hashes pertain to. Not that I'd want to see something that would likely give me nightmares, and if anybody did want to see them, they'd be illegal to view by their nature. OTOH, it would be nice to be sure the hashes are of 6-year-olds who couldn't be mistaken for anything older, and not 17-year-olds who could hardly be thought to be under 20.
 
If you have to exaggerate ridiculously to make the thing you're complaining about look bad, you might not actually have a point. Everybody seems to be in a competition to make CSAM hash detection seem horrifying. But when someone says something edifying and actually reasonable-sounding, you respond with... a bomb. Seriously?

Yes, seriously. Because some people don't understand software, or the ramifications of software acting as a pseudo-agent of law enforcement, or how the mandates of the well-intentioned can easily change or be corrupted, or, in case you are unaware of which article you are commenting on, how personal data can be inadvertently leaked and possibly abused no matter how many precautions are taken.
 
And actually I think you're refuting nonsense with falsehood. Everyone in fact *can't* see exactly what the CSAM detectors are searching for. As far as I know, Apple hasn't published either the list of hashes or the algorithm for hashing. And almost certainly they shouldn't, since that would let the baddies obfuscate their CSAM pix. Seeing a list of hashes wouldn't count as "everyone can see exactly what they check for" either. For damned sure you can't see the pictures those hashes pertain to. Not that I'd want to see something that would likely give me nightmares, and if anybody did want to see them, they'd be illegal to view by their nature. OTOH, it would be nice to be sure the hashes are of 6-year-olds who couldn't be mistaken for anything older, and not 17-year-olds who could hardly be thought to be under 20.

Not a falsehood at all. What Craig F. said in an interview:

Because Apple distributes the same version of each of its operating systems globally and the encrypted CSAM hash database is bundled rather than being downloaded or updated over the Internet, Apple claims that security researchers will be able to inspect every release.
 
Not a falsehood at all. What Craig F. said in an interview:

Because Apple distributes the same version of each of its operating systems globally and the encrypted CSAM hash database is bundled rather than being downloaded or updated over the Internet, Apple claims that security researchers will be able to inspect every release.

That literally means nothing. The hashes can be all-encompassing, with a global set of data for every government that wants to use it, and iOS has a history of functioning differently in different regions: FaceTime used to be disabled in Saudi Arabia, and the first-run experience in Russia lets you pick apps from local developers out of a mini App Store list during setup, despite it being the same global iOS binary.
 
That literally means nothing. The hashes can be all-encompassing, with a global set of data for every government that wants to use it, and iOS has a history of functioning differently in different regions: FaceTime used to be disabled in Saudi Arabia, and the first-run experience in Russia lets you pick apps from local developers out of a mini App Store list during setup, despite it being the same global iOS binary.

That’s not what “literally” means. The interview says it's the same hash database EVERYWHERE, since it is distributed as part of the OS image, and this can be confirmed by security researchers. Researchers have actually already found old versions of the hash database and algorithm on the phones. If Saudi Arabia or Russia gets a different hash database, it will be readily apparent to them.

This conversation is weak. You should know that anybody can see what is in the OS firmware, even WITHOUT an iPhone, because you can literally download it off Apple's website and analyze it. To make that concrete, here's a minimal sketch of the kind of check a researcher could run, assuming the encrypted database blob had already been extracted from two regional firmware images; the file paths are hypothetical placeholders, not real locations inside a firmware image.
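```python
# Compare the bundled database blob pulled out of two firmware images.
# Paths are hypothetical; this assumes the blobs have already been extracted.
import hashlib

def file_digest(path: str) -> str:
    """SHA-256 of a file, read in chunks so large blobs are fine."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

us_db = file_digest("extracted/us_firmware/csam_hash_db.bin")
ru_db = file_digest("extracted/ru_firmware/csam_hash_db.bin")

print("identical across regions" if us_db == ru_db else "databases differ: raise the alarm")
```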
 
That’s not what “literally” means. The interview says it's the same hash database EVERYWHERE, since it is distributed as part of the OS image, and this can be confirmed by security researchers. Researchers have actually already found old versions of the hash database and algorithm on the phones. If Saudi Arabia or Russia gets a different hash database, it will be readily apparent to them.

This conversation is weak. You should know that anybody can see what is in the OS firmware, even WITHOUT an iPhone, because you can literally download it off Apple's website and analyze it.

Unless you know what is in the hash database, it LITERALLY means nothing. You can hash multiple sets of data (one of them being CSAM) to search for in the same file, have the same code do multiple things at the same time, or work with region-specific subsets of the hashes in the file. The transparency argument is totally bunk. To sketch the concern: once two hash sets are merged into one opaque file, nothing in the file itself tells you which list an entry came from. Every value below is made up for illustration.
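```python
# Made-up hex values; the point is that a merged set is just opaque strings.
csam_hashes  = {"a3f1c2...", "9bc07d..."}   # hypothetical entries from one source
other_hashes = {"77d2e9...", "e41ab0..."}   # hypothetical entries from another source

bundled_db = csam_hashes | other_hashes      # shipped as a single blob

for h in sorted(bundled_db):
    print(h)  # nothing here distinguishes one source from the other
```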
 
Unless you know what is in the hash database, it LITERALLY means nothing. You can hash multiple sets of data (one of them being CSAM) to search for in the same file, have the same code do multiple things at the same time, or work with region-specific subsets of the hashes in the file. The transparency argument is totally bunk.

What are you talking about? The reference hashes are a known set distributed by NCMEC. Every tech company uses it. So when researchers look at the phone, they will either see the expected NCMEC hash values, or they will see something else. If they see something else, they will raise the alarm: "Apple appears to be looking for something other than child porn!" If the hash values DO match the NCMEC values, then the chance that they are all actually searching for something else is astronomically small. That comparison would be roughly the sketch below, assuming, purely for the sake of illustration, that both lists were available as plain-text files of hex values; the filenames are made up.
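```python
# Hypothetical filenames; one lowercase hex hash per line in each file.
def load_hashes(path: str) -> set[str]:
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}

reference = load_hashes("ncmec_reference_hashes.txt")
observed  = load_hashes("hashes_recovered_from_device.txt")

unexpected = observed - reference
if unexpected:
    print(f"{len(unexpected)} on-device hashes are NOT in the reference set")
else:
    print("every on-device hash matches the reference set")
```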
 
Unless you know what is in the hash database, it LITERALLY means nothing. You can hash multiple sets of data (one of them being CSAM) to search for in the same file, have the same code do multiple things at the same time, or work with region-specific subsets of the hashes in the file. The transparency argument is totally bunk.

From the lengthy technical documentation for CSAM detection:

“the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child-safety organizations”

In code form, that sentence describes roughly the shape of check sketched below. NeuralHash itself isn't public, so an ordinary SHA-256 stands in for the perceptual hash, and the threshold value is only a placeholder.
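```python
# Simplified sketch of on-device matching; SHA-256 stands in for the real
# perceptual hash, and KNOWN_HASHES / MATCH_THRESHOLD are placeholders.
import hashlib

KNOWN_HASHES: set[str] = set()   # the bundled database of known-image hashes
MATCH_THRESHOLD = 30             # placeholder; nothing is reported below the threshold

def count_matches(photos: list[bytes]) -> int:
    """Count how many photos hash into the known-hash database."""
    return sum(hashlib.sha256(p).hexdigest() in KNOWN_HASHES for p in photos)

def exceeds_threshold(photos: list[bytes]) -> bool:
    return count_matches(photos) >= MATCH_THRESHOLD
```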
 
Apple is going to have it rough over the next few years now that all the big boys are out to get them.

Weird how suddenly everybody stopped talking about CSAM though. Seems people got tired of complaining about it lol
A lot of ignorant people support Apple; it's sad, and it's a waste of time trying to convince them that a surveillance algorithm is not a good thing.
 