O M G. If you can’t even implement basic age filters then maybe don’t claim you’re so important you deserve 30% of everyone else’s income.
This whine makes no sense at all.

Not just privacy. Cost. Apple has unilaterally decided to use its customers' resources to perform checks on themselves, checks which Apple hasn't been doing for years, even though they could, because they didn't want to pay the bill.
Oh, wait, I just tested it and if you do this...

MacRumors had an article about how that doesn't really work as intended; here, let me give you the link:
Oh, wait, it's the article you're commenting on, whoops.
Watchdog Investigation Finds 'Major Weaknesses' in Apple's App Store Child Safety Measures
The non-profit watchdog group Campaign for Accountability today released a report revealing "major weaknesses" in Apple's App Store child... (www.macrumors.com)
Sorry, they actually started hash checks in 2008.
So that’s 13 years of Apple refusing to participate in checking for CSAM because they didn’t want to pay the price themselves.
Now that they can pass on the cost, they're acting like they really care.
My question still remains.
Huh? So who's been paying all the Apple employees and those of other companies to develop and implement this CSAM detection technology for iOS 15?
Apple is ONLY interested in CSAM technology which also allows them to scan a user’s personal files for arbitrary signatures.
If they really only cared about checking iCloud photos for known CSAM then they could have implemented that at any point in the last 13 years. But they didn’t.
Maybe they did. This is a thought I wouldn't have entertained a month ago. How times change.

And if Apple were really interested in snagging user personal files, they could have secretly implemented that years ago.
You do understand that with trivial modifications the hashes could be of anything, applied to any file, right?
You have no idea what you're talking about. That's not at all what's going on with Apple's CSAM detection. The only files being scanned are photos the user is syncing with iCloud, and the "signatures" are not arbitrary, but a database of known CSAM hashes. Additionally, since Apple has no access to your device, they ONLY know about images uploaded to iCloud (and then only detected CSAM... and THEN only if there are at least 30 of those detected images). They have already told you precisely what is happening. If you're calling them liars, then present your evidence to prove that.
And again, your logic is whack. You're basically saying that unless someone has done everything right from day one, their motives must be wrong when they start doing the right thing later. First of all, that defies all rules of logic. Secondly, my understanding is that one of the main reasons Apple has resisted scanning for CSAM in the cloud previously was precisely because they put such a high value on privacy; now they've come up with a way to maintain that commitment to privacy while still being able to detect illegal imagery being uploaded to their servers in violation of both their policies and federal law. If you think there's more to it than that, then don't just throw wild theories out there: present some hard evidence.
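To make the mechanics being argued about concrete, here is a toy sketch of the threshold idea described above, with made-up names. It is not Apple's actual NeuralHash / private-set-intersection protocol (in which neither the device nor Apple is supposed to learn match counts below the threshold); it only illustrates the claim that photos are compared against a fixed database of known hashes and nothing becomes reportable until the match count reaches a threshold, said to be 30.

```swift
// Toy sketch, hypothetical names; NOT Apple's actual implementation.
import Foundation

struct MatchResult {
    let matchedCount: Int
    let thresholdReached: Bool
}

/// Compares per-photo hashes against a database of known hashes; nothing
/// becomes reportable until the match count reaches the threshold.
func evaluateUploads(photoHashes: [String],
                     knownHashes: Set<String>,
                     threshold: Int = 30) -> MatchResult {
    let matches = photoHashes.filter { knownHashes.contains($0) }.count
    return MatchResult(matchedCount: matches, thresholdReached: matches >= threshold)
}

// Example: 2 of 4 hashes match, well under the threshold of 30, so nothing is flagged.
let result = evaluateUploads(photoHashes: ["h1", "h2", "h3", "h4"],
                             knownHashes: ["h2", "h4"])
print(result.matchedCount, result.thresholdReached) // 2 false
```

Note that the matching step itself is content-agnostic, which is really the crux of the disagreement above: what constrains the system is what goes into the hash database and which files get fed through it.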
Use the damn parental controls on your kid's iPhone. WTF is so hard about that?

So, to summarise: Apple is planning on doing invasive surveillance on your iPhone on the basis of (1) their technical competence, (2) their concern for children, and (3) trust regarding their motives and moral resolve, yet they can't manage to keep adult-rated apps, from which they make a profit, out of the hands of kids who have declared themselves to be 14 years old?
The vetting of apps in the App Store has been a neglected aspect of Apple's operations for years, yet they chose to develop spyware (IMO) integral to iOS using AI. Well, how about applying AI to sort out app vetting instead??? Apple is sinking lower in my estimation daily.
"If Apple already knows that a user is under 18, how can it let the user download adult apps in the first place?"
Satire. *cough*

LOL! Maybe they were also running a secret child sex slave trade. I mean it's pOsSiBlE, right? Again, present your evidence. If you don't have any, then I'd suggest keeping your wild theories to yourself, as each verbalized one makes you less and less credible to people who value rationality over conspiracy theories and fear-mongering.
Does Apple need parental controls to avoid selling an 18+ rated app to a self-declared 14-year-old? I would have thought they would understand computing functions related to basic inequalities.

Use the damn parental controls on your kid's iPhone. WTF is so hard about that?
Use...parental...control...on...your...kid's...iPhone.

Does Apple need parental controls to avoid selling an 18+ rated app to a self-declared 14-year-old? I would have thought they would understand computing functions related to basic inequalities.
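For what it's worth, the check being asked for really is a one-line inequality. A minimal sketch, with hypothetical names and an assumed 18+ threshold rather than Apple's actual storefront code, assuming the store knows the declared age and each app's minimum age rating:

```swift
// Toy illustration of the age-gate inequality discussed above.
// Names and the 18+ threshold are assumptions for the example, not Apple's API.
import Foundation

func canDownload(declaredAge: Int, appMinimumAge: Int) -> Bool {
    return declaredAge >= appMinimumAge
}

// A self-declared 14-year-old trying to get an 18+ rated app should be refused.
print(canDownload(declaredAge: 14, appMinimumAge: 18)) // false
```

The hard part, of course, is not the comparison but whether the declared age is collected and enforced consistently across the store, which is what the watchdog report linked above is about.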