Apple had claimed to be different.
This will end in tears.
Wrong.
The Exceptions to Section 230: How Have the Courts Interpreted Section 230? (itif.org)
Courts have generally interpreted Section 230 of the Communications Decency Act broadly, but they have identified some limits for when the law does not provide immunity from liability.
Or take Apple to court. I hope someone does this.
The law requires Apple to make a good faith effort to root out child pornography. That’s to maintain 230 liability immunity. Your case would get laughed out of court.
Wrong. Anyone who doesn’t make a good faith effort to inform users about illegal content can be stripped of their liability shield. Turning a blind eye is a form of encouragement. There’s already precedent for this.
Wrong. Did you even bother to read this link? It clearly says courts have trended toward exceptions to 230 only when the provider “induced” or “encouraged” the illegal content.
30 images. high enough threshold to prevent inaccurate matches
I guess Apple's employees who are hired to review these possible CSAM images will find out if that number is high enough to prevent false positives.
please keep screaming. this has to NOT GO AWAY. however long it takes. if it takes to iOS16 or infinity and beyond.
OK, so does this do an exact match or not? If it does an exact match, good luck with that, as pedophiles avoid the system by changing pixels. Also, if that were true, then why the need for the threshold? What Apple has released already doesn't really match Federighi's statement. It sounds like they are doing an approximate match based on perceptual features, so the question is not whether there will be false positives, or how often false positives will be flagged up, but whether false positives will always look like the CSAM images (e.g., having certain poses, exposed skin, etc.), in which case they will be sensitive pictures of people. And thirty false positives is potentially nothing given that people often take multiple pictures of the same event/scene.
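To make the "approximate match plus threshold" idea concrete, here is a rough sketch of how that general kind of scheme works. Everything in it is made up for illustration: the toy average_hash(), the Hamming-distance tolerance of 8, and the helper names are all hypothetical, and this is not Apple's NeuralHash. Per Apple's published description, the real system also does the comparison and thresholding under cryptographic protections rather than in plain code like this; only the 30-image threshold figure comes from the thread above.

```python
# Rough sketch of "approximate match + threshold" (NOT Apple's NeuralHash/PSI;
# names and numbers here are made up for illustration).

from typing import Iterable, List

HAMMING_TOLERANCE = 8    # hypothetical: max differing bits still treated as a match
ACCOUNT_THRESHOLD = 30   # the 30-image review threshold mentioned above

def average_hash(pixels: List[int]) -> int:
    """Toy 64-bit perceptual hash over e.g. 64 grayscale values: each bit is 1
    if that pixel is brighter than the block's mean, so visually similar images
    differ in only a few bits (unlike a cryptographic hash, where changing one
    pixel scrambles everything)."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of bit positions where two hashes differ."""
    return bin(a ^ b).count("1")

def is_match(image_hash: int, known_hashes: Iterable[int]) -> bool:
    """Approximate match: within the tolerance of any hash on the known list."""
    return any(hamming_distance(image_hash, h) <= HAMMING_TOLERANCE for h in known_hashes)

def account_flagged(image_hashes: Iterable[int], known_hashes: List[int]) -> bool:
    """Only surface an account for human review once the number of matched
    images crosses the threshold, so a handful of isolated false positives
    never triggers anything by itself."""
    matches = sum(1 for h in image_hashes if is_match(h, known_hashes))
    return matches >= ACCOUNT_THRESHOLD

if __name__ == "__main__":
    original = [10 * i % 256 for i in range(64)]
    known = [average_hash(original)]
    edited = list(original)
    edited[0] += 5                              # a tiny pixel edit
    print(is_match(average_hash(edited), known))  # True: still within tolerance
```

The point the sketch makes is that the tolerance is what lets slightly edited copies still match, and it is also exactly where unrelated photos can collide, which is the false-positive concern raised above; the per-account threshold, not the per-image comparison, is what is supposed to keep those collisions from ever reaching a reviewer.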
I am not confused*; I am concerned. And the concerns I have covered here apply only if the system works as planned, and is not corrupted, as it almost certainly will be, for far less noble purposes than detecting child porn. Apple has just demonstrated to every authoritarian government on the planet that their new chips plus a software framework can be used as an extension of an AI agent that can perform surveillance on virtually anything. Good job Apple. Idiots.
*Other than by seemingly self-contradictory statements Apple has made and their lack of transparency about the algorithm.
Wrong again. The Postal Service is exempt from liability for ALL content sent through the mail.
Google does this, Samsung does this, EVERYONE DOES THIS. So people saying it degrades Apple somehow aren’t paying attention. I don’t remember this uproar when Google started their version of this program.
How is law enforcement supposed to find out about illegal content if not through notification from service providers?
Wrong. Anyone who doesn’t make a good faith effort to inform users about illegal content can be stripped of their liability shield. Turning a blind eye is a form of encouragement. There’s already precedent for this
While the number of children who are sexually abused is way too high (and the only acceptable number is zero), the number you quoted is waaay too high: https://www.rainn.org/statistics/children-and-teens
Child s*x abuse is HUGE, a lot bigger than people seem to think. Most (like 70%+) of women have been abused s*xually as a child, almost half of men. Child s*x trafficking is bigger than adult s*x trafficking and barely legal teen p*rn is the most popular.
Child labor is bad but it doesn’t compare at all to how bad child s*x abuse and trafficking are.
Do I agree with CSAM scanning in general? I’m iffy on the whole thing, but it’s not surprising at all that people are ok with it.
You don’t have those rights on a privately owned platform
How is the government supposed to know if you’re doing anything illegal unless they can routinely search your house?
It’s called first getting PROBABLE CAUSE then conducting an INVESTIGATION and then getting permission from a JUDGE to violate your rights. What country do you think this is?
For who? Apple or its customers?
Apple had claimed to be different.
This will end in tears.
The list of things that are so awful is so long…
I just don't fundamentally believe in sacrificing personal privacy on device in this way -- for everyone -- in hopes of maybe catching some bad things.
That's a game that is literally never won, short of a totalitarian state.
It’s like “guilty until proven innocent.” My how Apple is stepping off the path here.
But I still want to know Who? Who at Apple came up with this bloody idea and how did this come about?
Just so we are clear… Apple, by law, has to make a good faith effort to keep child pornography off their platform to keep their immunity under Sec 230 of the CDA
You don’t have those rights on a privately owned platform
It’s like “guilty until proven innocent.” My how Apple is stepping off the path here.
But I still want to know Who? Who at Apple came up with this bloody idea and how did this come about?
Apple’s lawyers most likely, because they need to keep their 230 immunity intact.
FALSE. No such law. They only have to not ENCOURAGE it or PARTICIPATE in it.
False. See FOSTA-SESTA legislation.