Oh God! Don’t just delay it. CANCEL THIS, Apple. Can’t you see… people won’t be ordering the new iPhone 13 if you launch this child safety crap.
Maybe they are consigning it to a slow death?
Hope.
> I don't think Apple are changing their mind on this. More like they don't want a hot debate cannibalizing impending sales of the about-to-launch iPhone 13.

They’re not. They will still release the god damn thing. They’re hoping the talk on this will die down.
Just going to wait until everyone forgets and then quietly release a .1 update with some security enhancements, etc.
> Just going to wait until everyone forgets and then quietly release a .1 update with some security enhancements, etc.

Nope. Can you imagine the $h1tstorm that would ensue if they implemented this in a sneaky, secretive way???
I wonder how many additional children will be victimized from now until then? Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding, ignorant politicians and self-centered, selfish advocacy groups. It’s unbelievable!
> Perhaps now the needless shoe peeing by people who have no idea what they are yelling about, other than "IT'S A SLIPPERY SLOPE......" will calm down, or at least find something else in their lives to wet themselves over.

Sounds like you're the one wetting him/herself.
> I wonder how many additional children will be victimized from now until then? Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding, ignorant politicians and self-centered, selfish advocacy groups. It’s unbelievable!

Because they weren’t being victimized before Apple came along? Get real 🙄🙄🙄🙄
> As a father, I share your contempt for anyone who would abuse or exploit children, but comments like this assume most of those scumbags are naive enough to store and share that material in ways that would make it readily detectable by something like this CSAM-scanning scheme. I believe most are not breathing a sigh of relief, simply because this scheme was never a serious threat to what they do.

You'd probably be surprised how careless these scumbags are when it comes to the collections of CSAM they hoard. Most of them don't really believe they're doing anything morally wrong, especially the ones who are "just collecting pictures off the internet."
> As I’ve said in related threads, I’m fine with Apple scanning whatever we upload to their cloud. Just don’t perform ANY portion of the scanning/verification process on my device.

From the beginning, I've believed that this is the first step toward Apple improving end-to-end encryption in the cloud, and if the entire iCloud Photo Library were to be end-to-end encrypted, then it would be impossible for Apple to scan for CSAM anywhere else but on your device. That's a trade-off I'd be more than happy to make.
Good. Good. At least Apple is listening. Some of its policies regarding the App Store are boneheaded, and its public relations person needs to be shown the door because he or she did a terrible job on this.
But at least it is listening to feedback.
> Just not true. Not true whatsoever.

It is true for me...
I hope they at least keep the parental control part where parents can have it prescreen messages sent to their young kids. That might even be useful for adults who don't want unsolicited imagery. It's just the CSAM part that's controversial.
> So what should they do?

The same thing they were planning on doing (matching hashes against known databases of CSAM), but on iCloud servers only. It’s the “on-device” part that’s concerning most people from what I’ve seen, including myself.
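For anyone wondering what “matching hashes against known databases” means in practice, here is a minimal, purely illustrative sketch of a server-side check. The `knownDigests` set and the use of plain SHA-256 are assumptions for the sake of a short example; Apple’s actual proposal relied on a perceptual hash (NeuralHash) plus private set intersection and a threshold scheme, none of which this snippet attempts to reproduce.

```swift
import Foundation
import CryptoKit

// Hypothetical database of known-image digests (hex-encoded).
// A real system would use vetted perceptual hashes rather than SHA-256,
// so that re-encoded or resized copies of an image would still match.
let knownDigests: Set<String> = [
    // "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

/// Server-side sketch: hash the uploaded photo's bytes and check
/// whether the digest appears in the known-hash set.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownDigests.contains(hex)
}

// Example usage with stand-in bytes for an already-uploaded photo.
let uploadedPhoto = Data("not a real image".utf8)
print(matchesKnownDatabase(uploadedPhoto) ? "match" : "no match")
```

The only point of the sketch is that a comparison against a known-hash list can run entirely on the server, which is what this commenter is asking for; everything that made Apple’s design controversial (on-device matching, safety vouchers, thresholds) sits outside this snippet.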
> While I agree with what you’ve said about server-side scanning, what you said about people sharing CSAM is not accurate. The initial makers and distributors of CSAM may be savvy enough to avoid being caught, but there are plenty of morons among the consumers who spread it around using Facebook, WhatsApp, Google, etc. It’s a huge problem on all social media and cloud networks, and Facebook alone has millions of cases to investigate per year.

Exactly this, and as I mentioned in a previous post, catching these morons often helps law enforcement actually nail the scumbags who are creating this stuff.
> I hope they at least keep the parental control part where parents can have it prescreen messages sent to their young kids. That might even be useful for adults who don't want unsolicited imagery.

True, except that it won't be available for adults, at least not in its current form. It can only be applied to users who are under 18 years of age and part of a Family Sharing group where the parents have opted into the feature.