Google's reading of emails and scanning of photos is written into their TOS; Apple will include a detailed version in their TOS too.
I have mostly said that for those who think switching to Android will solve the problem.
And the look on their faces when they see how long their hardware takes to get updates (if it gets any at all), I would pay to see that.

edit: for those who don't know, if you exclude the Pixel from Google:
I have never had an Android smartphone that received more than 2 different versions of Android, and NEVER at launch time.
 
If Apple were serious about privacy, they would be much more focused on securing their OS. If such functionality exists, it will be exploited by malware.
Why is Apple so quiet about Pegasus? Countless iPhones were compromised by spyware-as-a-service. And the next thing they announce is this on-device spying functionality.
I seriously need to look into Android ROMs.
 
If Apple were serious about privacy, they would be much more focused on securing their OS. If such functionality exists, it will be exploited by malware.
Why is Apple so quiet about Pegasus? Countless iPhones were compromised by spyware-as-a-service. And the next thing they announce is this on-device spying functionality.
I seriously need to look into Android ROMs.
An Android ROM means an unlocked bootloader, which makes it very easy for malware to infect the device.
It's the same with jailbreak and Android root, though.

And like I said, an Android device is never kept updated as long as an iPhone.
 
Apple wouldn't decrypt a freakin' terrorist's phone to help the investigation. Yet they are doing this on a massive scale now.
This was the first thing I thought of.

People were ACTUALLY killed by these terrorists and they told the FBI to kick rocks. Now it's ok to flip through my sh*t (even if it's only a hash :rolleyes:) for a "potential". Sure, they'll catch a few stupid creeps, but MILLIONS of innocent people have to be searched for that to happen... no.
 
Still interesting to see the usual Apple apologists nowhere to be seen.
 
Still interesting to see the usual Apple apologists nowhere to be seen.
And also interesting to see this many users talking about a protocol they don't understand (without having read the official PDF on how it works), speaking only after reading an "article" on the internet.
 
It's funny to think that Siri is terrible, yet we are just going to accept that this thing won't have tons of false positives because Apple says so. Google Images will always find tons of images that are very similar, and Google actually has good AI. I guess have fun letting Apple look through all your nudes that get falsely picked up, to catch basically no criminals, because they would have to be awfully dumb to upload to iCloud.
 
It's funny to think that Siri is terrible, yet we are just going to accept that this thing won't have tons of false positives because Apple says so. Google Images will always find tons of images that are very similar, and Google actually has good AI. I guess have fun letting Apple look through all your nudes that get falsely picked up, to catch basically no criminals, because they would have to be awfully dumb to upload to iCloud.
Yeah, and users will read the official documentation instead of commenting on an article that tries to explain a technology they don't understand at all.

But cats will fly before that happens.
 
Yeah, and users will read the official documentation instead of commenting on an article that tries to explain a technology they don't understand at all.

But cats will fly before that happens.
The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images.

Here is a quote directly from Apple on the front page of this site. As you can see, they will get info from photos that get flagged, and as I explained before, you have to be pretty blind to think they won't have tons of false positives.

So yeah, reading the official documentation before posting on a site is pretty silly.
 
depends on your semantics of "scanning".

A hash function is applied to the picture on-device (that is an information-destroying function, so I wouldn't call that a scan). In case of a match, the picture is flagged with this "safety voucher", which probably contains the necessary encryption keys so that a dedicated Apple team can inspect those photos once the threshold has been exceeded.

So privacy is reduced only in those cases, and for those pictures, where a match was found.
The probability of that happening for a non-CSAM picture is related to the "coarseness" of the hash algorithm; by using a finer hash function Apple could further reduce this risk.

If the mechanism works as designed, then in my view the benefits outweigh the risk of the privacy intrusion.
However, Apple should first prove it can make the mechanism tamper-proof before even thinking of opening it up to third parties.
I am sure that the data attached to the photo is as minimal as it can be; however, if one has many thousands of photos and does not sync to iCloud, it is still wasting space on the drive. I don't appreciate that either. I do not question that catching child abusers is a worthy cause, but they are doing this to catch the very few disgusting perverts who are also not at all paranoid about cloud storage and are too dumb to know that syncing photos like this to a cloud system is a bad idea. They are doing all of this for what is surely a small portion of these perverted bastards, meaning they are not going to prevent much while they cause inconvenience, potential risk, and wasted storage for their users. The better solution would be to keep 100% of this on iCloud, with no data added to the phone, and to factor the added iCloud storage caused by this data out of the storage quota for the 99.99.......% of users who do not have this type of content anyway.
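For what it's worth, here is a deliberately simplified sketch (in Swift, with made-up names) of the match / safety-voucher / threshold flow described in the quoted post above. Apple's published design additionally wraps this in private set intersection and threshold secret sharing, so neither the device nor Apple sees plain match results below the threshold; this only illustrates the basic logic, not the actual implementation.

```swift
import Foundation

// Hypothetical names throughout; this is a sketch of the flow, not Apple's code.
struct SafetyVoucher {
    let photoID: UUID
    let payload: Data            // stands in for the encrypted visual derivative
}

struct MatchPipeline {
    let knownHashes: Set<Data>   // perceptual hashes of known CSAM, shipped with the OS
    let threshold: Int           // matched vouchers required before any review is possible
    var vouchers: [SafetyVoucher] = []

    // Called with the on-device perceptual hash of a photo about to be uploaded.
    mutating func process(photoID: UUID, perceptualHash: Data, derivative: Data) {
        guard knownHashes.contains(perceptualHash) else { return }  // non-matches reveal nothing
        vouchers.append(SafetyVoucher(photoID: photoID, payload: derivative))
    }

    // Human review of the matched derivatives only unlocks past the threshold.
    var reviewableVouchers: [SafetyVoucher] {
        vouchers.count >= threshold ? vouchers : []
    }
}

// Example: two matches against a threshold of three stay sealed.
var pipeline = MatchPipeline(knownHashes: [Data([0x01]), Data([0x02])], threshold: 3)
pipeline.process(photoID: UUID(), perceptualHash: Data([0x01]), derivative: Data())
pipeline.process(photoID: UUID(), perceptualHash: Data([0x02]), derivative: Data())
print(pipeline.reviewableVouchers.isEmpty)   // true: below the threshold, nothing is reviewable
```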
 
So, you have a new function/API in iOS, something like is_it_CSAM(img). You, as some third party app (good or nefarious), give this new function an image and it tells you “true” or “false”, answering the question “does this image match one of the hashes in the CSAM hash database that’s baked into iOS 15?” - you can’t supply anything other than an image, and it only answers yes or no.

Please explain how some nefarious actor, Chinese or otherwise, can do something bad with this. Not hand waving. An actual bad thing they could do. You seem very certain there’s an obvious horrible problem here, so it should be easy to explain.
Simply by providing a database full of "alternative" hashed files to search against, rather than the CSAM ones.

At the end of the day it's just a database; the parameters can be tweaked based on the legal requirements of that country. And yes, there will be requirements cooked up.
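A tiny hypothetical illustration of that point (made-up names, not Apple's API): hash-set membership testing is content-agnostic, so nothing in the code can tell what the database represents; the safeguard lies entirely in how the database is sourced and audited.

```swift
import Foundation

// Hypothetical sketch, not Apple's API: the matching step is just set membership.
// Nothing here knows whether `blockedHashes` was derived from the NCMEC CSAM list
// or from any other list a government might mandate.
func isFlagged(imageHash: Data, blockedHashes: Set<Data>) -> Bool {
    blockedHashes.contains(imageHash)
}
```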
 
These sick p3@dos aren't going to be using iCloud to begin with. Okay, a few thick ones probably will, but the numbers will be tiny, and even fewer now - Apple news travels and this story is getting picked up everywhere; the perverts will be cleaning out their iClouds as we speak.

So by the time iOS 15 lands they'll have deleted their ****, everyone will have agreed to the new T&Cs because we'll be excited to download the new OS, and BOOM - there's the back door.
 
The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images.

Here is a quote directly from Apple on the front page of this site. As you can see, they will get info from photos that get flagged, and as I explained before, you have to be pretty blind to think they won't have tons of false positives.

So yeah, reading the official documentation before posting on a site is pretty silly.
You are silly indeed: https://www.apple.com/child-safety/...s_for_Children_Frequently_Asked_Questions.pdf is an FAQ, not the technical documentation. Go do your homework.
 
You are silly indeed: https://www.apple.com/child-safety/...s_for_Children_Frequently_Asked_Questions.pdf is an FAQ, not the technical documentation. Go do your homework.
“Using new applications of cryptography, Apple is able to use these hashes to learn only about iCloud Photos accounts that are storing collections of photos that match to these known CSAM images, and is then only able to learn about photos that are known CSAM, without learning about or seeing any other photos”

”Apple conducts human review before making a report to NCMEC.”
 
And how do you know the false positives will be low? Because Apple has such a great track record when it comes to AI? You have nothing to base that on, while I'm using real-life evidence.
I have to agree; I am skeptical of Apple's low-false-positive claims. In addition, Google has been using machine learning for a while now, and we know that it incorrectly flags content on a regular basis. Even if Apple is correct that their implementation has a low false positive rate, that is still going to be a lot of customers, considering how many customers they have and the number of photos taken.
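To put rough numbers on that worry: below is a back-of-the-envelope sketch where every figure is assumed for illustration (the per-image false-match probability, the library size, and the review threshold are not Apple's published numbers). It shows that the per-account risk is dominated by the per-image collision rate, which is exactly the number in dispute.

```swift
import Foundation

// All numbers below are assumptions for illustration, not Apple's figures.
let p = 1e-6          // assumed chance a single innocent photo matches a known hash
let n = 10_000.0      // assumed number of photos in one iCloud library
let t = 30.0          // assumed matches required before an account is reviewed

// With independent photos, false matches per account are roughly Poisson(lambda).
let lambda = p * n    // 0.01 expected false matches per account

// log P(X = t); when lambda << t, the tail P(X >= t) is dominated by this term.
let logTail = -lambda + t * log(lambda) - lgamma(t + 1)
print("P(account falsely flagged) ≈ \(exp(logTail))")   // ~4e-93 with these assumptions
```

With these made-up inputs the per-account risk is negligible, but bump the assumed per-image rate from 1e-6 to 1e-3 and the same arithmetic puts a few hundred accounts over the threshold across a billion users, which is why the argument really turns on how good the perceptual hash actually is.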
 
I have mostly said that for those who think switching to Android will solve the problem.
And the look on their faces when they see how long their hardware takes to get updates (if it gets any at all), I would pay to see that.

edit: for those who don't know, if you exclude the Pixel from Google:
I have never had an Android smartphone that received more than 2 different versions of Android, and NEVER at launch time.

It's possible for privacy freaks to go completely off big tech (Apple, Microsoft, and Google) with Linux, and for phones custom ROMs do the job: the likes of e-foundation, GrapheneOS, LineageOS, CalyxOS, etc., all Google-free.
 
It's possible for privacy freaks to go completely off big tech (Apple, Microsoft, and Google) with Linux, and for phones custom ROMs do the job: the likes of e-foundation, GrapheneOS, LineageOS, CalyxOS, etc., all Google-free.
Yeah, it's possible to push privacy to the max, but you lose security in the process.
So even Apple users must always keep in mind that sometimes security matters more than privacy, and sometimes privacy matters more than security.

But here there are just nuts who reject this wholesale based on a white paper, so it isn't even implemented yet, without even trying to understand how they make it work (because a perceptual hash doesn't work at all like a plain hash such as MD5; see the sketch below).
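For anyone curious about that last point, here is a toy contrast (made-up code, not NeuralHash): a cryptographic hash such as SHA-256 changes completely when a single pixel changes, whereas a perceptual hash is built so that visually similar images produce similar hashes, compared by Hamming distance rather than exact equality.

```swift
import Foundation
import CryptoKit

// Toy illustration (not Apple's NeuralHash): cryptographic vs. perceptual hashing.
// A tiny 8x8 "grayscale image" as 64 brightness values, plus a one-level tweak.
let original: [UInt8] = (0..<64).map { UInt8($0 * 4) }
var tweaked = original
tweaked[0] &+= 1                                 // change one "pixel" by one level

// Cryptographic hash (SHA-256): a tiny change gives a completely different digest.
let shaA = SHA256.hash(data: Data(original))
let shaB = SHA256.hash(data: Data(tweaked))
print("SHA-256 digests equal:", shaA == shaB)    // false

// Toy perceptual "average hash": 1 bit per pixel, set when brighter than the mean.
func averageHash(_ pixels: [UInt8]) -> [Bool] {
    let mean = pixels.map(Int.init).reduce(0, +) / pixels.count
    return pixels.map { Int($0) > mean }
}
let hamming = zip(averageHash(original), averageHash(tweaked)).filter { $0 != $1 }.count
print("Perceptual hash Hamming distance:", hamming)  // 0: effectively the same hash
```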
 
An Android ROM means an unlocked bootloader, which makes it very easy for malware to infect the device.
It's the same with jailbreak and Android root, though.

And like I said, an Android device is never kept updated as long as an iPhone.
ROMs from GrapheneOS and CalyxOS are locked; only LineageOS is unlocked.
 
ROMs from GrapheneOS and CalyxOS are locked; only LineageOS is unlocked.
Maybe, but you will never have the same update support as Apple in the end (or even the Pixel if you prefer Google), so you lose some updates in the process.
I have tried many Android phones between my iPhone 4 and my iPhone X, and I came back to Apple because Android updates (even security ones) are a real mess.
 
I understand the privacy implications; I'm just bored by all the stupidity I read... The only thing Apple has to explain is how they ensure the hash list isn't modified by a third party between the CSAM database and our iPhones; the rest is just speculation and misunderstanding.
Because I'm pretty sure none of you have even read how the Neural Engine works or how a perceptual hash even works.
But yeah, you know more... I've had my dose of stupidity for this week. Good week to you.
Yeah, I'm going to go out on a limb here and say that more than a few of us here know how the Neural Engine works. I'd also go out on a limb and say that more than a few of us have jobs where we have to manage teams that implement features like this as dictated by the business side of the house. Part of that job is making sure you create a feature/product/service that hits its intended purpose but does not allow the same item to be abused. Once again, you are looking at this in the most direct fashion and not looking at the big picture.

I can help you understand this a little bit further.
You are a "dev". You are thinking like a jr dev though... think outside of the immediate problem and think of the ways the feature could be abused. That is 101 stuff right there... you should know better than that.
 
Yeah, I'm going to go out on a limb here and say that more than a few of us here know how the Neural Engine works. I'd also go out on a limb and say that more than a few of us have jobs where we have to manage teams that implement features like this as dictated by the business side of the house. Part of that job is making sure you create a feature/product/service that hits its intended purpose but does not allow the same item to be abused. Once again, you are looking at this in the most direct fashion and not looking at the big picture.

I can help you understand this a little bit further.
You are a "dev". You are thinking like a jr dev though... think outside of the immediate problem and think of the ways the feature could be abused. That is 101 stuff right there... you should know better than that.
You know what, think what you want and bash Apple; in the meantime the adults will solve the problem, and it will ship in iOS 15, because only 1 or 2% of privacy freaks will be upset by this.
 