> It's interesting, isn't it... that a "crime" would be reported even if the picture that was syncing was a completely innocent picture of my baby daughter crawling around without anything on!

What's even more interesting is that this new system hasn't even been implemented yet to show any wrongdoing, so coming up with non-facts is quite silly.
If my having to make this small concession in privacy protects even one adolescent from sexual exploitation, then it will have been worth it.
That's not true either. If one traffics in CSAM and uploads to iCloud, your photos will be hashed and compared against a database (with some threshold) that, as I understand it, takes up space on your phone. The algorithm doesn't do facial recognition, and the treatment of PII hasn't changed.
So if you (the global you, not you personally) want to be exempt from scanning:
- log out of iCloud
- ditch iPhones/iPads
- don't engage in the trafficking of CSAM.
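The matching described above can be sketched in a few lines. To be clear, this is a toy illustration, not Apple's implementation: the real system uses NeuralHash perceptual hashes and a private set intersection protocol, whereas this sketch uses plain SHA-256 and an in-memory set just to show the count-matches-against-a-threshold flow. The hash values and the threshold number are made up.

```python
import hashlib

# Hypothetical database of known-image digests (illustration only).
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}
MATCH_THRESHOLD = 2  # no report until this many matches accumulate


def count_matches(photos):
    """Count how many photo blobs hash into the known set."""
    return sum(
        hashlib.sha256(p).hexdigest() in KNOWN_HASHES for p in photos
    )


def should_report(photos):
    # An account is flagged only once matches reach the threshold,
    # so a single stray match stays below the reporting bar.
    return count_matches(photos) >= MATCH_THRESHOLD
```

Note that an exact cryptographic hash like SHA-256 only matches byte-identical files; a perceptual hash such as NeuralHash is designed to also match resized or re-encoded copies of the same image, which is why it is used instead.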
> At this point, might as well buy a phone or computer from Huawei. At least they don't try to lie about selling your information to the government.

That isn't a positive move and would negate your conviction-based stance as a whole.
> Do people seriously not know that your Gmail, Yahoo Mail, Facebook, Insta, etc. are all scanned already?

I submit that the people using those programs probably don't really care about their online privacy to begin with, or give it any real thought; or worse, try to justify using programs that exploit them because others use the program, etc.
> Do people seriously not know that your Gmail, Yahoo Mail, Facebook, Insta, etc. are all scanned already?

They do NOT scan at the local level. I have medical data and photos stored locally on my phone which are not in the cloud.
> No, Mr. Cook. You may introduce that parental control, which I do not have switched on due to lack of children, but you will not check my photos "on device" for possible illicit images. There are no "illicit images". I know, you think I am a liar and would like to check, but I will not allow that because I am not a liar. I am just a humble customer of your company - if I may...
> So no iOS 15 here.
> Oh, EDIT: I do have quite a wide variety of your company's products in use, which may diminish, but I am aware that this doesn't really interest you. There are also quite a lot of people who frequently consult me about technical things they do not understand. I think that doesn't interest you either...
> So be it.

Great post. And I'm sure true for many of us here, including that many of us are the "tech advisors" to many of our friends and family.
> Game over bashers, Apple has just confirmed to TechCrunch that if iCloud Photos is disabled the scan won't even run.

For now it is. Once again, how does a user verify that the CSAM software is not running on the device when iCloud Photos is disabled? All we have is Apple's word; that is the problem.
> How do you verify anything about a closed-source OS?

The problem is Apple chose to install surveillance software on devices without consent and created this problem. The user only has Apple's word on what the software does, and there is no way to verify it. Mind you, no other company installs this type of software on your device.
> Game over bashers, Apple has just confirmed to TechCrunch that if iCloud Photos is disabled the scan won't even run.

Game over bashers????? 🤣 🤣 😅 Should I assume you are having a party because your team "won"?
> The problem is Apple chose to install surveillance software on devices without consent and created this problem. The user only has Apple's word on what the software does, and there is no way to verify it. Mind you, no other company installs this type of software on your device.

You will give Apple consent by installing iOS 15 and enabling iCloud.
> Then you're overreacting.

Right. Us and 7,000 other people, including some of the world's foremost privacy and crypto experts, who are as alarmed as we average users are. https://appleprivacyletter.com