Wrong.


Wrong. Did you even bother to read this link? It clearly says courts have trended toward exceptions to 230 only when the provider “induced” or “encouraged” the illegal content.
 
Do I agree with csam in general? I’m iffy on the whole thing but it’s not surprising at all that people are ok with it.

The list of things that are so awful is so long..

I just don't fundamentally believe in sacrificing personal privacy on device in this way -- for everyone -- in hopes of maybe catching some bad things.

That's a game that is literally never won, short of a totalitarian state.
 
Wrong. Did you even bother to read this link? It clearly says courts have trended toward exceptions to 230 only when the provider “induced” or “encouraged” the illegal content.
Wrong. Anyone who doesn’t make a good faith effort to inform users about illegal content can be stripped of their liability shield. Turning a blind eye is a form of encouragement. There’s already precedent for this.
 
Child s*x abuse

Part of the reason ongoing cases of abuse exist is that victims are shamed by cultural norms which shun sex. Consequently, sexual abuse goes unreported. Not even saying what it is contributes to that: "child sex abuse" gets censored or replaced with "friendly" acronyms like "CSAM" which don't make people quite as uncomfortable.

As a society, we don't help when we do things like this. Call a spade a spade.

It is called child sexual abuse.
 
OK so does this do an exact match or not? If it does an exact match, good luck with that, as pedophiles will avoid the system by changing a few pixels. Also, if that were true, then why the need for the threshold? What Apple has released already doesn't really match Federighi's statement. It sounds like they are doing an approximate match based on perceptual features, so the question is not whether there will be false positives, or how often false positives will be flagged up, but whether false positives will always look like the CSAM images (e.g., having certain poses, exposed skin, etc.), in which case they will be sensitive pictures of people. And thirty false positives is potentially nothing, given that people often take multiple pictures of the same event/scene.
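To make the exact-vs-approximate distinction concrete, here is a rough sketch of what generic perceptual-hash matching looks like. This is plain difference hashing (dHash) with the Pillow library, not Apple's NeuralHash, and the known-hash value and match radius below are made up for illustration. The point is just that small pixel edits barely move the hash, so matching is done by Hamming distance rather than exact equality, which is also why visually similar but unrelated images can land inside the same radius.

```python
# Generic perceptual "difference hash" (dHash) using Pillow, to illustrate
# approximate image matching. This is NOT Apple's NeuralHash; it is a far
# simpler, well-known technique with the same basic property: small pixel
# edits barely change the hash, so matching uses Hamming distance, not equality.
from PIL import Image

def dhash(path: str, size: int = 8) -> int:
    """Hash bit = whether each grayscale pixel is brighter than its right neighbour."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of bit positions where the two hashes differ."""
    return bin(a ^ b).count("1")

# Hypothetical usage: a "match" is any image whose hash falls within a small
# radius of a known hash. Both values below are invented for the demo.
KNOWN_HASHES = {0x3A5F9C00D2E41B77}   # made-up example, not a real database entry
MATCH_RADIUS = 5                      # illustrative threshold, not Apple's

def looks_like_known(path: str) -> bool:
    h = dhash(path)
    return any(hamming(h, k) <= MATCH_RADIUS for k in KNOWN_HASHES)
```

That is the whole tension: a radius of zero is defeated by re-encoding or nudging pixels, while any radius above zero admits lookalikes.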

I am not confused*, I am concerned. And the concerns I have covered here apply only if the system works as planned and is not corrupted, as it almost certainly will be, for far less noble purposes than detecting child porn. Apple has just demonstrated to every authoritarian government on the planet that their new chips plus a software framework can be used as an extension of an AI agent that can perform surveillance of virtually anything. Good job, Apple. Idiots.

*Other than by seemingly self-contradictory statements Apple has made and their lack of transparency about the algorithm.
please keep screaming. this has to NOT GO AWAY. however long it takes. if it takes to iOS16 or infinity and beyond.

Apple WAS the beacon of a free digital world, where you had the richest guard dog in the world. now they are a branch of law enforcement with sights on everyone, treating us by default as sketchy. no thanks. I hope I am one of MANY MANY MANY who actually do something about it.

I know this week this happened for me:
iMessage was turned off
iCloud photos turned off
all Siri functions turned off
duckduckgo became my default browser
I lowered my iCloud storage tier to the free plan
I signed up for proton mail and made it default
I signed up for proton VPN (turned on)
I signed up for signal

I sent links to all these apps and services to every one of my friends and family members. so far about 40% of those people have at least installed and adopted them.

the churn may be slow, but I am going to do everything I can to divest and encourage everyone around me to do the same. this situation is unacceptable. and I am a shareholder! it is going to be really hard to do this, and I am not happy I feel like I need to. but we need to stand up as a collective whole for the future. "won't somebody think of the kids"? there is more to the future than stopping CSAM.

I hope they take a hit and walk back. while I'm not going to drop my perfectly good phone, I will move to different hardware next time if they don't walk back.

my privacy from snooping is paramount, not because I have anything sketchy, but on principle.
 
Google does this, Samsung does this, EVERYONE DOES THIS. So people saying it degrades Apple somehow aren’t paying attention. I don’t remember this uproar when Google started their version of this program.

And as said before, because everyone else is doing it (supposedly), that makes it okay, right? 🙄
 
Google does this, Samsung does this, EVERYONE DOES THIS. So people saying it degrades Apple somehow aren’t paying attention. I don’t remember this uproar when Google started their version of this program.

People selected Apple / iPhone because Apple said they were not doing this sort of thing.

Now Apple is doing this sort of thing and people are upset. How is this confusing to you?
 
How is law enforcement supposed to find out about illegal content if not through notification from service providers?

How is the government supposed to know if you’re doing anything illegal unless they can routinely search your house?

It’s called first getting PROBABLE CAUSE then conducting an INVESTIGATION and then getting permission from a JUDGE to violate your rights. What country do you think this is?
 
Remember how Apple has recently been trying to stop leaks coming from inside Apple? There's been news coverage on it. I believe this is not really about product leaks. Sure, they hurt Apple, but the free advertising from them also helps Apple.

I believe the leak prevention work Apple has done recently is more about preventing leaks of the shady stuff Apple is doing, like CSAM searching. Apple does not want us to know about stuff like this until it is too late for us to stop it.
 
Wrong. Did you even bother to read this link? It clearly says courts have trended toward exceptions to 230 only when the provider “induced” or “encouraged” the illegal content.
Wrong. Anyone who doesn’t make a good faith effort to inform users about illegal content can be stripped of their liability shield. Turning a blind eye is a form of encouragement. There’s already precedent for this.

 
Child s*x abuse is HUGE, a lot bigger than people seem to think. Most women (like 70%+) were abused s*xually as children, and almost half of men. Child s*x trafficking is bigger than adult s*x trafficking, and "barely legal teen" p*rn is the most popular category.

Child labor is bad but it doesn’t compare at all to how bad child s*x abuse and trafficking are.

Do I agree with csam in general? I’m iffy on the whole thing but it’s not surprising at all that people are ok with it.
While the number of children who are sexually abused is way too high (and the only acceptable number is zero), the figure you quoted is waaay too high: https://www.rainn.org/statistics/children-and-teens

Secondly, most people do not abuse kids. We do not search everyone's home just because some people commit crimes; you need a warrant. The same should apply to our online data.

Thirdly, the rise of the internet has made it exponentially easier to share CSAM. Whatever Apple catches with their system will unfortunately only be a drop in the bucket. Has anyone proven that cloud providers scanning everyone's photos for known CSAM actually decreases child abuse and creation of CSAM, rather than merely catching already known images? Apple has been happy to explain why their NeuralHash system makes everything OK, but they haven't explained if their system will actually decrease child abuse.

Fourthly, Apple says the threshold is that 30 photos need to match. That means if Apple has 29 images that match the hash, making them extraordinarily likely to be CSAM, Apple will just leave them up on their servers? That sounds all kinds of illegal, or Apple's algorithm is truly s**t.
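For context on the 30-photo point: Apple's published technical summary describes the threshold as threshold secret sharing, meaning the server cannot decrypt anything about matched vouchers until at least 30 of them exist, so below the threshold there is nothing readable to act on. Here is a toy Shamir-style sketch of that "nothing below the threshold" property. It is not Apple's code; the prime, the threshold of 3, and the demo secret are all made up (Apple's stated threshold is 30, and its payloads are decryption keys rather than a bare integer).

```python
# Toy Shamir threshold secret sharing, illustrating the "nothing below the
# threshold" idea described in Apple's CSAM Detection technical summary.
# NOT Apple's code: prime, threshold, and secret are invented for this demo.
import random

PRIME = 2**61 - 1     # a Mersenne prime, plenty for a toy demo
THRESHOLD = 3         # Apple's stated threshold is 30; 3 keeps the demo short

def make_shares(secret: int, n_shares: int, t: int = THRESHOLD):
    """Split `secret` into points on a random polynomial of degree t-1."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange-interpolate at x=0 to recover the polynomial's constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

if __name__ == "__main__":
    secret = 123456789
    shares = make_shares(secret, n_shares=10)
    print(reconstruct(shares[:THRESHOLD]))      # 3 shares -> 123456789
    print(reconstruct(shares[:THRESHOLD - 1]))  # 2 shares -> an unrelated number
```

With the full threshold of shares the secret comes back exactly; with one fewer you get a meaningless value, which is the sense in which 29 matches reveal nothing in a scheme like this. Whether that design choice is acceptable is a separate question from whether it works.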
 
How is the government supposed to know if you’re doing anything illegal unless they can routinely search your house?

It’s called first getting PROBABLE CAUSE then conducting an INVESTIGATION and then getting permission from a JUDGE to violate your rights. What country do you think this is?
You don’t have those rights on a privately owned platform
 
The list of things that are so awful is so long..

I just don't fundamentally believe in sacrificing personal privacy on device in this way -- for everyone -- in hopes of maybe catching some bad things.

That's a game that is literally never won, short of a totalitarian state.

It’s like “guilty until proven innocent.” My how Apple is stepping off the path here.

But I still want to know Who? Who at Apple came up with this bloody idea and how did this come about?
 
It’s like “guilty until proven innocent.” My how Apple is stepping off the path here.

But I still want to know Who? Who at Apple came up with this bloody idea and how did this come about?

A few others have hit on it in here.

I think they’re just trying to take the easiest way to get law enforcement off their back.

It is a little bit startling how often Apple is “surprised“ by the reaction to their moves in the last couple years. They seem to have quite the internal echo chamber.

Translation… An exceptional amount of hubris
 
It’s like “guilty until proven innocent.” My how Apple is stepping off the path here.

But I still want to know Who? Who at Apple came up with this bloody idea and how did this come about?
Apple’s lawyers most likely, because they need to keep their 230 immunity intact
 
FALSE. No such law. They only have to not ENCOURAGE it or PARTICIPATE in it.
False. See FOSTA-SESTA legislation
From the legal analysis:

“Additionally, FOSTA-SESTA puts further conditions on the applicability of 230, and platforms very much are legally required to remove child pornography.”
 