Google reads the emails and scans photos on their servers, not on the device. Google reading emails and scanning photos is written in their TOS; Apple will include a detailed version in their TOS too.
I have mostly said that for those who think that going to Android will solve the problem: Google reading emails and scanning photos is written in their TOS, and Apple will include a detailed version in their TOS too.
An Android ROM means an unlocked bootloader, which makes it very easy for malware to infect the device.

If Apple was serious about privacy, they would be much more focused on securing their OS. If such functionality exists, it will be exploited by malware.
Why is Apple so quiet about Pegasus? Countless iPhones were compromised by spyware-as-a-service, and the next thing they announce is this on-device spying functionality.
I seriously need to look into Android ROMs.
This was the first thing I thought of. Apple wouldn't decrypt a freakin' terrorist's phone to help the investigation, yet they are doing this on a massive scale now.
Also interesting to see this many users talk about a protocol they don't understand (or haven't read the official PDF of how it works) and who only speak after having read an "article" on the internet.

Still interesting to see the usual Apple apologists nowhere to be seen.
Yeah, and users will read official documentation instead of commenting on an article that tries to explain a technology they don't understand at all.

It's funny to think that Siri is terrible, yet we are just going to accept that this thing won't have tons of false positives because Apple says so. Google Images will always find tons of images that are very similar, and Google actually has good AI. I guess have fun letting Apple look through all your nudes that get falsely picked up, to catch basically no criminals, because they would have to be pretty dumb to upload to iCloud.
The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images.

Yeah, and users will read official documentation instead of commenting on an article that tries to explain a technology they don't understand at all.
But cats will fly before that happens.
I am sure the data attached to each photo is as minimal as it can be; however, if one has many thousands of photos and does not sync to iCloud, it is still wasting space on the drive. I don't appreciate that either. I do not question that catching child abusers is a worthy cause, but they are doing this to catch the very few disgusting perverts who are also not at all paranoid about cloud storage and too dumb to know that syncing photos like this to a cloud system is a bad idea. They are doing all of this for what is surely a small portion of these perverted bastards, meaning they are not going to prevent much while they cause inconvenience, potential risk, and wasted storage for their users. The better solution would be to keep 100% of this in iCloud, with no data added to the phone, and to factor the added iCloud storage caused by this data out of the storage quota for the 99.99...% of users who do not have this type of content anyway.

Depends on your semantics of "scanning".
A hash function is applied to the picture on-device (that is an information-destroying function, so I wouldn't call it a scan). In case of a match, the picture is flagged with a "safety voucher", which probably contains the encryption keys a dedicated Apple team needs to inspect those photos once the threshold has been exceeded. So privacy is reduced only in those cases, and only for those pictures where a match was found. The probability of that happening for a non-CSAM picture is related to the "coarseness" of the hash algorithm; by using a finer hash function, Apple could further reduce this risk. If the mechanism works as designed, in my view the benefits outweigh the risk of the privacy intrusion. However, Apple should first prove it can make the mechanism tamper-proof before even thinking of opening it up to third parties.
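To make the flow described above concrete, here is a minimal sketch of threshold-based matching against a set of known hashes. All names (PerceptualHash, SafetyVoucher, reportingThreshold) are invented for illustration; the actual design uses a perceptual NeuralHash plus blinded matching, which is not reproduced here, and exact set membership is a deliberate simplification.

```swift
import Foundation

// Illustrative types only -- not Apple's actual NeuralHash or safety-voucher format.
struct PerceptualHash: Hashable {
    let bits: UInt64
}

struct SafetyVoucher {
    let photoID: UUID
    let matchedHash: PerceptualHash
}

// Toy on-device matcher: a photo whose hash appears in the known set gets a voucher,
// and nothing is escalated for human review until the voucher count crosses a threshold.
struct OnDeviceMatcher {
    let knownHashes: Set<PerceptualHash>   // shipped with the OS in the described design
    let reportingThreshold: Int            // threshold value is an assumption here

    func vouchers(for photos: [(id: UUID, hash: PerceptualHash)]) -> [SafetyVoucher] {
        photos.compactMap { photo -> SafetyVoucher? in
            guard knownHashes.contains(photo.hash) else { return nil }
            return SafetyVoucher(photoID: photo.id, matchedHash: photo.hash)
        }
    }

    // Privacy is only reduced for accounts that exceed the threshold, and then only
    // for the matched photos -- which is the point the post above is making.
    func shouldEscalate(_ vouchers: [SafetyVoucher]) -> Bool {
        vouchers.count >= reportingThreshold
    }
}
```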
Simply by providing a database full of "alternative" hashed files to search against rather than the CSAM ones.

So, you have a new function/API in iOS, something like is_it_CSAM(img). You, as some third-party app (good or nefarious), give this new function an image and it tells you "true" or "false", answering the question "does this image match one of the hashes in the CSAM hash database that's baked into iOS 15?" You can't supply anything other than an image, and it only answers yes or no.
Please explain how some nefarious actor, Chinese or otherwise, can do something bad with this. Not hand waving. An actual bad thing they could do. You seem very certain there’s an obvious horrible problem here, so it should be easy to explain.
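As a rough illustration of the narrow surface this post describes, here is a hypothetical Swift sketch. Nothing in it is a real iOS API; the names are invented purely to show that the only input is an image and the only output is a yes/no.

```swift
import Foundation

// Hypothetical protocol, not a real iOS API: a caller can only hand over image
// data and get a Bool back -- it cannot read the hash list or substitute its own.
protocol KnownHashChecking {
    func matchesKnownHash(_ imageData: Data) -> Bool
}

// Even a "nefarious" caller is limited to counting yes/no answers for images it
// already possesses; the baked-in database itself is never exposed to it.
func countMatches(in images: [Data], using checker: KnownHashChecking) -> Int {
    images.filter { checker.matchesKnownHash($0) }.count
}
```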
Where this doesn't add up for me is this - would p3@dos even be using iCloud in the first place? I guess we're about to find out.
You are silly indeed: https://www.apple.com/child-safety/...s_for_Children_Frequently_Asked_Questions.pdf is an FAQ, not the technical data; go do your homework.

The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images.
Here is a quote directly from Apple on the front page of this site. As you can see, they will get info from photos that get tagged, and as I explained before, you have to be pretty blind to think they won't have tons of false positives.
So yeah, reading official documentation before posting on a site is pretty silly.
“Using new applications of cryptography, Apple is able to use these hashes to learn only about iCloud Photos accounts that are storing collections of photos that match to these known CSAM images, and is then only able to learn about photos that are known CSAM, without learning about or seeing any other photos”

You are silly indeed: https://www.apple.com/child-safety/...s_for_Children_Frequently_Asked_Questions.pdf is an FAQ, not the technical data; go do your homework.
I have to agree; I am skeptical of the low false-positive claims from Apple. In addition, Google has been using machine learning for a while now, and we know that it incorrectly flags content on a regular basis. Even if Apple is correct that their implementation has a low false-positive rate, that is still going to be a lot of customers, considering how many customers they have and the number of photos taken.

And how do you know it will have low false positives? Because Apple has such a great track record when it comes to AI? You have nothing to base that on, while I'm using real-life evidence.
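For what it's worth, the scale argument can be made concrete with a back-of-envelope calculation; every number below is invented for illustration and is not a figure Apple has published.

```swift
// Back-of-envelope only: all of these numbers are assumptions, not Apple's figures.
let perImageFalseMatchRate = 1e-9     // assumed chance a benign photo matches a known hash
let photosPerAccount = 10_000.0       // assumed library size
let accounts = 1_000_000_000.0        // assumed number of iCloud Photos accounts

let expectedFalseMatchesPerAccount = perImageFalseMatchRate * photosPerAccount
let expectedFalseMatchesOverall = expectedFalseMatchesPerAccount * accounts

print(expectedFalseMatchesPerAccount)  // 1e-05 -- tiny for any one account
print(expectedFalseMatchesOverall)     // 10000.0 -- still thousands of benign photos fleet-wide
```

Which way that cuts depends on the threshold: a human review only triggers after multiple matches on one account, so a low per-image rate compounds into an even lower per-account rate, but across enough accounts some benign matches remain inevitable.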
I have mostly said that for those who think going to Android will solve the problem.
And the face they will make when they see their hardware's update schedule (if it gets updates at all), I would pay to see that.
Edit: for those who don't know, excluding Google's Pixel, I have never had an Android smartphone that got more than 2 different versions of Android, and NEVER at launch time.
Yeah, it's possible to push privacy to the max, but you lose security in the process.

It's possible for privacy freaks to go completely off big tech (Apple, Microsoft, and Google) with Linux, and for phones custom ROMs work: the likes of e-foundation, GrapheneOS, LineageOS, CalyxOS, etc., all Google-free.
ROMs from GrapheneOS and CalyxOS are locked; only LineageOS is unlocked.

An Android ROM means an unlocked bootloader, which makes it very easy for malware to infect the device. It's the same with jailbreak and Android root, though. And like I said, an Android device is never updated for as long as an iPhone.
Maybe, but you will never have the same update lifespan as Apple in the end (or even the Pixel, if you prefer Google), so you lose some updates in the process.

ROMs from GrapheneOS and CalyxOS are locked; only LineageOS is unlocked.
Yeah, I'm going to go out on a limb here and say that more than a few of us here know how the neural engine works. I'd also go out on a limb and say that more than a few of us have jobs where we have to manage teams that implement features like this as dictated by the business side of the house. Part of that job is making sure you create a feature/product/service that hits its intended purpose but does not allow the same item to be abused. Once again, you are looking at this in the most direct fashion and not looking at the big picture.

I understand the privacy implication; I'm just bored of all the stupidity I read ... the only thing Apple has to explain is how they ensure the hash list isn't modified by a third party between the CSAM database and our iPhones; the rest is just speculation and misunderstanding.
Because I'm pretty sure none of you have even read how the neural engine works or how blur hashes even work.
But yeah, you know more ... I've had my dose of stupidity for this week; good week to you.
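On the "how do they ensure the hash list isn't modified by a third party" point, one standard answer is to sign the database and verify the signature on-device before using it. The sketch below (using CryptoKit) is an assumption about how that could look, not a description of Apple's actual mechanism, and all names are hypothetical.

```swift
import Foundation
import CryptoKit

// Sketch of one generic way to protect a shipped database against tampering in transit:
// verify a detached signature over the blob before loading it. Function and parameter
// names are hypothetical; this is not Apple's published mechanism.
func loadVerifiedHashDatabase(blob: Data,
                              signature: Data,
                              vendorPublicKeyRaw: Data) -> Data? {
    guard let key = try? Curve25519.Signing.PublicKey(rawRepresentation: vendorPublicKeyRaw),
          key.isValidSignature(signature, for: blob) else {
        return nil   // refuse to use a database whose signature does not check out
    }
    return blob
}
```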
You know what, think what you want and bash Apple; in the meantime the adults will solve the problem, and it will be in iOS 15, because only 1 or 2% of privacy freaks will be upset by this.
I can help you understand this a little bit further.
You are a "dev". You are thinking like a jr dev though... think outside of the immediate problem and think of the way the feature could be abused. That is 101 stuff right there... you should know better then that.