Dear Apple,
I’m not a criminal. I resent that iPhone is now designed to monitor me like a criminal.
> I totally read and understood what they say about how it works, but I will NEVER allow anyone to scan my data, and for sure I will NOT PAY anyone to be able to do so.

Too late, if you've ever used any modern device in any way whatsoever.
> Too late, if you've ever used any modern device in any way whatsoever.

So you are saying we need to be louder until they stop the invasion they are already doing, because clearly they want more invasive control.
> So you are saying we need to be louder until they stop the invasion they are already doing, because clearly they want more invasive control.

No, I'm saying all of your photos have been scanned since day 1 of the iPhone. Granted, they were scanned for different reasons (applying effects, indexing, etc.), but they've been scanned since day 1.
> Stop saying we don't understand the issue, that Craig explains it all in this video, and that we're all being emotional children. At least we're not naive sheep. It's you who is misunderstanding the fundamental issue. Stop drinking Apple's BS-flavored Kool-Aid.

If the shoe fits 👠
> I've watched the video. It's complete white-washed BS that entirely skirts the fundamental problem with this technology, which is: there is nothing to prevent anyone with the clout to strong-arm Apple from replacing the set of images they are scanning for with an entirely different set of criteria, for a completely different purpose. What they're putting in place is the groundwork for any government agency to scan for any content it likes.

This sounds like a slippery slope fallacy.
> The "auditing" and "transparency" schtick is complete crap. Who are the auditors, and who audits them? There's one simple solution: do not put this system in place, for any reason, no matter how noble. It's a slippery slope down a razor-blade slide into a pool of alcohol.

Oh nice, you admit it's literally a slippery slope fallacy. Nice arguing with ya 👋
> No, I'm saying all of your photos have been scanned since day 1 of the iPhone. Granted, they were scanned for different reasons (applying effects, indexing, etc.), but they've been scanned since day 1.

And that's clearly wrong, so it needs to stop. We need to make this a criminal offense to discourage other companies from doing it. The goal at this point should be more privacy for the user.
> And that's clearly wrong, so it needs to stop. We need to make this a criminal offense to discourage other companies from doing it. The goal at this point should be more privacy for the user.

You think it's clearly wrong for your phone to process the image-sensor data into a pleasing photo that you are able to view on your phone? How is that a privacy issue?
> I'm currently on my 4th iPhone (had a 3G, 4S, 6, and X) and was planning to pick up a 13 in the fall. But this is making me think twice. As the owner of the device I paid a lot of money for, I decide what spyware runs (or doesn't run) on it. End of story.

What options do you have?
We empirically assessed NeuralHash performance by matching 100 million non-CSAM photographs against the perceptual hash database created from NCMEC's CSAM collection, obtaining a total of 3 false positives, as verified by human inspection. Separately, we assessed NeuralHash on an adult pornography dataset of about 500,000 images, where we observed zero false positives against the perceptual hash database. The threat model explicitly takes into account the possibility of NeuralHash image-level false positives by imposing a required threshold of simultaneous matches for a given account before Apple's iCloud Photos servers can decrypt any vouchers.
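To make that threshold mechanic concrete, here is a minimal sketch of the gating logic, assuming a toy hash function and a toy database. This is not Apple's protocol (the real design uses NeuralHash plus private set intersection and threshold secret sharing, so the server learns nothing until the threshold is crossed); every name and number below is a stand-in, with the 30-match threshold taken from the figure cited elsewhere in this thread.

```python
# Toy sketch of threshold-gated perceptual-hash matching -- an
# illustration only, NOT Apple's implementation. The real system uses
# NeuralHash, blinded hash tables, private set intersection, and
# threshold secret sharing; all names and values here are stand-ins.

MATCH_THRESHOLD = 30  # illustrative account-level threshold

# Stand-in for the perceptual-hash database derived from NCMEC's collection.
KNOWN_HASHES = {0x1A2B3C, 0x4D5E6F}

def toy_perceptual_hash(image_bytes: bytes) -> int:
    """Stand-in for NeuralHash: maps an image to a short hash value."""
    return hash(image_bytes) & 0xFFFFFF  # toy 24-bit hash

def match_count(images: list[bytes]) -> int:
    """Count how many of an account's images hash into the database."""
    return sum(toy_perceptual_hash(img) in KNOWN_HASHES for img in images)

def vouchers_decryptable(images: list[bytes]) -> bool:
    """Nothing is revealed below the threshold, so a stray image-level
    false positive (e.g., 3 in 100 million) flags no one."""
    return match_count(images) >= MATCH_THRESHOLD
```

Even in this toy version the point of the threshold is visible: isolated image-level collisions never reach the account-level bar.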
> You think it's clearly wrong for your phone to process the image-sensor data into a pleasing photo that you are able to view on your phone? How is that a privacy issue?

It is if anyone but me has access to the data associated with them. So if you want to make my teeth whiter, then get my permission to do it and limit the behavior to what I requested. But that whiter smile should never be shared, in any way, until I specifically share it.
> I am not defending Apple on this one, but how are you better off with Windows and Android privacy-wise?

I'm wiser now!
> So Apple's got a stash of half a million porn pics, yet won't allow nudity on Apple TV. Cheeky Tim Apple.

I think they admitted to having a stash of more than 30 child porn images.
> Apple said at WWDC they were supporting both iOS 14 and 15. Wonder if they knew back then they'd have a mess on their hands when this came out.

For the sake of this comment, I'm going to assume this feature won't be in iOS 14. If I'm wrong on that, that will invalidate my next statement.
Mass surveillance by left-wing Apple, Inc.
> Weird, because I haven't seen a single coherent explanation of why it's a problem if your own device scans your photos for child porn, and only does so if you are trying to upload them onto Apple's servers, and only produces information to Apple if you have at least thirty child-porn photos that you are trying to upload.

Let me help with an analogy:

I don't see why it's a problem if a police officer follows you around and is with you 24/7, but only acts if you try to commit a crime and only produces a police report if you have committed at least 30 of those crimes.

You own the device. Apple doesn't have a right to actively scan it on the hardware you own.
> It is if anyone but me has access to the data associated with them. So if you want to make my teeth whiter, then get my permission to do it and limit the behavior to what I requested. But that whiter smile should never be shared, in any way, until I specifically share it.

Oh, you're responding to a different point altogether. Admittedly, I was being pedantic with So@So@So.
> The irrefutable fact is they're building a back door into their OS... and by the wording of a prior interview, they've been arm-twisted by higher powers to do so.

And I think we all deserve to know who twisted their arm.
> Let me help with an analogy:
>
> I don't see why it's a problem if a police officer follows you around and is with you 24/7, but only acts if you try to commit a crime and only produces a police report if you have committed at least 30 of those crimes.
>
> You own the device. Apple doesn't have a right to actively scan it on the hardware you own.

So you think there may be a lawsuit about this? Because it appears that Apple does have the right?
> Let me help with an analogy:
>
> I don't see why it's a problem if a police officer follows you around and is with you 24/7, but only acts if you try to commit a crime and only produces a police report if you have committed at least 30 of those crimes.
>
> You own the device. Apple doesn't have a right to actively scan it on the hardware you own.

Let me give you a better analogy: