You need to study the law. Apple is required by law to make a good-faith effort to keep illegal content off their platform to keep their immunity intact under Section 230 of the CDA. Also, your on-device intelligence ALREADY scans your photos for object detection. They just added illegal content to the scanning parameters.

Completely false. There is no such provision in 230 requiring Apple to police private content. If that were true, they would have to scan everything from email to the Address Book.
 
Isn't it just grand how misinformed people are, making up laws on the fly to defend their beloved Apple?
That user is wrong. See the FOSTA-SESTA clarification on 230 immunity:

Additionally, FOSTA-SESTA puts further conditions on the applicability of 230, and platforms very much are legally required to remove child pornography.

 
Completely false. There is no such provision in 230 requiring Apple to police private content. If that were true, they would have to scan everything from email to the Address Book.
Quit spewing your false garbage. Read this legal analysis of 230, which includes FOSTA-SESTA’s clarification on child pornography.

 
“Degrade encryption”? That’s false. End-to-end encryption isn’t broken at any point. They use on-device intelligence to keep encryption intact.

Wrong again. Encryption is broken at the point the iPhone sends photos to iCloud, and then to Apple employees for investigation.
 
People say they want to protect the children from child abuse

What happens when they want to protect the children from gays and lesbians?
What happens when they want to protect the children from African-Americans or Chinese?
What happens when they want to protect the children from Muslims, or Jews, or Catholics?
What happens when they want to protect the children from art and music?

Do you see the slippery slope?

The reason I’m most appalled is that I trusted Apple with my privacy. I won’t give them the chance in the future.

How would you use the CSAM detection to do this?
Please provide details since you seem so sure it's easy.
 
Wrong again. Encryption is broken at the point the iPhone sends photos to iCloud, and then to Apple employees for investigation.
Photos being uploaded to iCloud aren’t end-to-end encrypted to begin with. You can’t break what isn’t there. I was speaking about Messages.
 
Photos being uploaded to iCloud aren’t end-to-end encrypted to begin with. You can’t break what isn’t there. I was speaking about Messages.

Makes no difference. That encryption method is not secure.

You don't create the key.

Data can be recovered without the key.
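
For what it’s worth, “Apple encrypts it” and “end-to-end encrypted” are very different claims. Here’s a toy Python sketch (using the third-party cryptography package; the names and flow are my own illustration, not Apple’s actual key management) of why provider-held keys mean the data can be recovered without you:

```python
# Toy sketch: provider-side encryption vs. end-to-end encryption
# (illustrative only; Apple's real key management is far more involved).
from cryptography.fernet import Fernet

# With standard iCloud Photos, the provider generates and holds the
# key; the user never creates it ("you don't create the key").
provider_key = Fernet.generate_key()
provider = Fernet(provider_key)

# The photo is encrypted at rest on the provider's servers...
ciphertext = provider.encrypt(b"user photo bytes")

# ...but because the provider holds the key, it (or anyone who can
# compel it) can recover the plaintext without the user's help.
assert provider.decrypt(ciphertext) == b"user photo bytes"
```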
 
Makes no difference. That encryption method is not secure.

You don't create the key.

Data can be recovered without the key.
I’m not arguing security. Someone made the claim that Apple is weakening encryption. That is false. They aren’t altering any encryption whatsoever.
 
Sorry, what part am I wrong about?

Apple is going to use a software method to review my images.

If my images = illegal, notify a government-sanctioned organization.

Therefore my phone becomes a government surveillance device.

The system is not designed to catch illegal images in general; it only matches hashes of already-known images, so it's incredibly bad at catching most illegal images out there.
The scanning is only done if you have decided to store those images on Apple's property (iCloud).
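
To make the “only known images” point concrete, here’s a minimal Python sketch of hash-set matching. The names and the use of SHA-256 are my own illustrative stand-ins; Apple’s actual system uses a perceptual NeuralHash, a match threshold, and human review rather than anything this simple:

```python
# Toy sketch of hash-based matching (illustrative only, NOT Apple's
# NeuralHash). The point: only images whose hashes already appear in
# the known-image database can ever be flagged; a brand-new image
# produces an unknown hash and is invisible to this check.

import hashlib

# Hypothetical database of hashes of already-known illegal images,
# shipped to the device as an opaque blob.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; the real system uses a neural,
    # transformation-tolerant hash, not a cryptographic one.
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag(image_bytes: bytes) -> bool:
    # Flag only on an exact database hit; novel content never matches.
    return image_hash(image_bytes) in KNOWN_BAD_HASHES

# Runs (conceptually) only for photos queued for iCloud upload.
print(should_flag(b"vacation photo bytes"))  # False: unknown hash
```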
 
Wrong. Anyone who doesn’t make a good-faith effort to police illegal content can be stripped of their liability shield. Turning a blind eye is a form of encouragement. There’s already precedent for this.

Show me the precedent.
 
Wrong. Did you even bother to read this link? It clearly says courts have trended toward exceptions to 230 only when the provider “induced” or “encouraged” the illegal content.
If Apple is doing this to keep their immunity, they should have announced that fact, and with these changes Apple needs to make iCloud Photos storage opt-in, not opt-out.
 
Well, I don't like the perceived privacy invasion, or what they can/will do in the future. Whilst I think child abusers should be executed, I don't like the new situation, which is almost guilty until proven innocent. I have put up with iOS and iPadOS restrictions (some of which are commercial rather than security related), as I thought iOS was much more secure than Android. Next week I will be attempting to install /e/OS on an Android phone to test. If it's good, I will not be getting an iPhone 13 Max (had a 5, 5S, 6 Plus, 6S Plus, 8 Plus, and XS Max; the last four with top-of-the-range RAM). I was also looking at getting a larger M2 iMac when released, to replace my 27-inch iMac, but am unlikely to do that now.

I was all in at one point: iPhones, Apple Watches, iPads, Apple TVs, iMacs, etc. I've really gone off Apple now. I can understand Apple protecting themselves by not allowing questionable material on iCloud, but to snoop on customer devices? Not good.
 
The FOSTA-SESTA legislation made clear that content hosts MUST screen for child pornography.

See this link for an explanation:


The SESTA amendment to 230 specifically uses these words: it is illegal to knowingly assist, facilitate, or support.

If you don’t look at your customers’ photos, you can’t KNOWINGLY be doing anything wrong.
 
The SESTA amendment to 230 specifically uses these words: it is illegal to knowingly assist, facilitate, or support.

If you don’t look at your customers’ photos, you can’t KNOWINGLY be doing anything wrong.
The problem is that if Apple turns a blind eye to it, they can be held legally liable. I do not like the fact that the scanning is done on device, but it is Apple's legal team that is most likely protecting the company's shareholders from a lawsuit that would hurt their investment.
 
The problem is that if Apple turns a blind eye to it, they can be held legally liable. I do not like the fact that the scanning is done on device, but it is Apple's legal team that is most likely protecting the company's shareholders from a lawsuit that would hurt their investment.
Correct. Courts are interpreting FOSTA-SESTA to require content hosts to make good-faith efforts to scan for child pornography.
 
  • Like
Reactions: MozMan68
Please keep screaming. This has to NOT GO AWAY, however long it takes, even if it takes until iOS 16 or infinity and beyond.
...
I am not trying to scream. I am trying first to understand, and Apple's explanations have been very glib and abstract. I do think, however, that even if their system works as advertised, they have truly opened Pandora's box. Sometimes a demonstration of feasibility is all it takes to convince authoritarian governments to abuse technology. If Apple's system is as secure as they claim, what prevents governments from coercing Apple and others into shipping operating systems that detect other forms of content (flags, memes, faces of gay people, etc.)? Nothing, regrettably.

One more thing: I think in both the US and the UK, anti-privacy legislation has gotten a free ride since 9/11. The issue is not just about Apple but about surveillance per se, in a world where mobile devices will have enough computing power for rudimentary AI on their own, and sophisticated AI when linked to other machines online. It's time we evaluate whether our laws are strong enough to protect privacy. IMO they aren't.
 