> Morality aside, if this person is so stupid as to have these images on their phone and let them be uploaded to the cloud, they are too dumb to bother protecting.

Tell that to Jennifer Lawrence.

> Tell that to Jennifer Lawrence.

She was a huge celebrity, uploading her nude pictures to the cloud, and not having 2 factor authentication on.

> Pretty sure they're not reporting to law enforcement. Don't think we have seen any reports out. At this point, it's a false accusation.

You say that like it's a good thing. There are ways to scan for and combat this material without breaking encryption.

> I miss the old days when we were arguing about the notch and Face ID. This is a bad Apple.

The notch is here to stay, lol. It doesn't even bother consumers anymore. I think people gave up and got used to it. Apple can't figure it out.

Bunch of ******** to hide the fact that they will scan your photos and messages. You have to be stupid to believe it will only be for children between 0 and 12 years old.

> I look forward to the millions of teenagers being arrested on child pornography charges. Finish turning the high schools into prisons.

Read up on how the scan works. It's looking for known, circulated child pornography images, not scanning for every nipple and dick it can find.

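[Editor's note: a minimal sketch of the known-image lookup described above. SHA-256 stands in for the real perceptual hash (Apple's system uses NeuralHash, which also matches visually similar images, not just byte-identical ones), and the database entry is a placeholder, not real data.]

```python
import hashlib

# Hypothetical database of hashes of known, already-circulated images.
# In the real system these come from child-safety organizations; the
# entry here is a meaningless placeholder.
KNOWN_HASHES = {"placeholder-entry"}

def image_hash(data: bytes) -> str:
    # Stand-in for a perceptual hash. A cryptographic hash only matches
    # byte-identical files, which is the key simplification in this sketch.
    return hashlib.sha256(data).hexdigest()

def is_known_image(data: bytes) -> bool:
    # New, original photos are not in the database, so they never match.
    return image_hash(data) in KNOWN_HASHES
```

The point the comment makes falls out of the structure: only images whose hashes are already in the database can ever match, so novel photos are never "found" by the scan.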
> Maybe one could explain to me what kind of photos you are afraid to show to Apple? I know, I know, it will heavily rain red thumbs-down scores. But maybe you can share your concerns with REAL examples.
>
> In my world: I have never taken naked pictures of myself, I know no one among my friends, relatives, and parents who has ever done so at any time, nor do I wish to have contact with those who have.

Are you telling me you are OK with Apple snooping inside your iPhone and looking at pictures of your wife?

> No. They scan an iPhone if it's set up as a 0-12 child's account in a family. Or when you use iCloud storage for the Photos app AND are in the USA.

The guy just said the hashes are coded into iOS 15, and it is being pushed to all iPhones worldwide. The scanning is still done locally, but the phone won't transmit any matches to Apple unless you have US iCloud turned on, for now. So IMO this is still a scandal: Apple is forcing a US-centric policy onto everyone in the world, since the hashes and scanning are coded into iOS.

Turn off iCloud and they don't run the CSAM scan. But if you upload to Google or Dropbox, guess what they do?

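[Editor's note: the gating these comments describe — the hash database ships with iOS everywhere, but matches only leave the device when iCloud Photos is on, and only for US accounts at launch — can be sketched as below. All names and fields are invented for illustration; this is not Apple's code.]

```python
from dataclasses import dataclass

@dataclass
class Device:
    # Invented fields, only to model the conditions described in the thread.
    icloud_photos_enabled: bool
    region: str  # e.g. "US"

def matches_leave_the_device(device: Device) -> bool:
    # Scanning happens locally regardless, but match results are only
    # transmitted alongside iCloud Photos uploads, and only for US
    # accounts at launch.
    return device.icloud_photos_enabled and device.region == "US"
```

Turning iCloud Photos off makes the condition false, which is exactly the "turn off iCloud and they don't scan" point, even though the hash database is still present on the device.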
> She was a huge celebrity, uploading her nude pictures to the cloud, and not having 2 factor authentication on.

So it was OK for people to access, sell, and distribute them because Apple messed up security, people really wanted to see her naked, or she didn't declare them private enough by having a password?

That was a huge blunder for Apple, but it was easily avoidable.

> Maybe one could explain to me what kind of photos you are afraid to show to Apple? I know, I know, it will heavily rain red thumbs-down scores. But maybe you can share your concerns with REAL examples.
>
> In my world: I have never taken naked pictures of myself, I know no one among my friends, relatives, and parents who has ever done so at any time, nor do I wish to have contact with those who have.

Apparently you don't understand the concept of principles, in this case the principle of privacy. Just because you don't have anything to hide doesn't mean you should be OK with others seeing or knowing your thoughts, files, photos, or locations.

> Maybe one could explain to me what kind of photos you are afraid to show to Apple? I know, I know, it will heavily rain red thumbs-down scores. But maybe you can share your concerns with REAL examples.
>
> In my world: I have never taken naked pictures of myself, I know no one among my friends, relatives, and parents who has ever done so at any time, nor do I wish to have contact with those who have.

This is a dumb argument. If you have nothing to hide, then why not let the authorities do weekly inspections of your house, frisk you on the street, or search your car? Why not get audited every day? Innocence and privacy are not mutually exclusive. While I feel people are blowing this thing slightly out of proportion (it does open the door to bad stuff in the future), yours is far and away the worse argument.

> If Apple had led with this interview, I think a lot of people's concerns would have been laid to rest from the very start.

I disagree. The fact that Apple has had to go on the back foot shows how foolish it was to proceed at all. They have blotted their copybook, and whatever they say, whatever is suggested, this still comes down to SURVEILLANCE and PRIVACY, the mainstays that have served Apple well in protecting against invasions of privacy. Now it has shot itself in the foot so badly that even current court cases may be affected: they have virtually handed their opponents a bat to hit them with, and I can see companies and individuals lining up to make the most of it, including Epic, Facebook, Elon, etc.

> Read up on how the scan works. It's looking for known, circulated child pornography images, not scanning for every nipple and dick it can find.

Already did. But look at it from the other end: as the images circulate and proliferate across the web, they get added to the database. The biggest distributors of child porn are kids themselves. The content they entertain each other with is adults' problem to live with, as we try to "protect" them from their own sexuality. Ridiculous. Hopefully this backfires in an appropriately extreme way.

> I have not broken a single law. Matter of fact, I work with law enforcement. (Police Department)

Well, it's simpler than you think, I expect. Apple is liable if they store collections of illegal content in the iCloud system, I suspect. Sure, they have a user agreement that says they are not responsible for what users store there, and users agree not to store illegal content there, blah blah. They still get served with dozens of warrants to get data out of iCloud. They do not have the keys to decrypt on-phone data (so Apple says, and it's probably true), but we've seen that is not the case with iCloud backups. So I'd guess this move is to try and cut off all the child-imagery warrants. If they can show a chain of blocking or detecting illegal child imagery on users' devices before it goes to iCloud (or at least flagging those accounts), then, poof, hopefully for them a whole class of warrant requests drops off.

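[Editor's note: the "flagging those accounts" step can be sketched as a simple per-account threshold counter. The class below is illustrative, not Apple's implementation; Apple publicly described an initial threshold of around 30 matches before an account is surfaced for review.]

```python
class MatchLedger:
    """Counts hash matches per account and flags an account only once
    the count crosses a threshold, so isolated matches (including the
    occasional false positive) never surface anyone. The default value
    is illustrative."""

    def __init__(self, threshold: int = 30):
        self.threshold = threshold
        self.counts: dict[str, int] = {}

    def record_match(self, account: str) -> None:
        self.counts[account] = self.counts.get(account, 0) + 1

    def flagged(self, account: str) -> bool:
        return self.counts.get(account, 0) >= self.threshold
```

Under this design, the "chain" the comment imagines would only produce account flags, not a stream of individual image reports, which is what would let Apple argue a whole class of warrants is moot.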
I just don't want Apple scanning my iPhone and digging through my private data. This is not the Apple we know. Apple is up to something.

> Long-winded responses that boil down to, "Apple no longer respects your privacy."

More like "We are holier than thou."

> The worst argument ever when it comes to privacy.

From the creators of "we believe it's a fundamental human right".