And yet they don't. They will permit up to 29 child porn images to be uploaded and stored on their servers before they take action. If they were following the law they would prevent any of it being stored and report the uploader immediately to law enforcement.
They just have to make a good faith effort.
 
Exactly. If China or Russia wants to do surveillance, they’ll just directly decrypt and look at someone’s entire iCloud data, because they already have access to the servers in their country, and not have to rely on this convoluted system to report suspected matched images.
That assumes that it will be limited to images. Eventually the system will not look just at pictures, but also at App activity that cannot be seen from the data in iCloud.
 
I wasn’t referring to only America. And even when looking at American statistics, it’s well known that they don’t represent minorities well.

Abuse is subjective. If you ask most millennials, Gen X, the silent generation and baby boomers, they will say yes, they endured some abuse from their parents, whether physical, emotional, sexual or verbal. So no, we cannot say most people don’t abuse children.

And I agree with your last 2 points; that’s why I said this whole thing is very iffy to me. But I also said I can see why people are ok with it.
1) Where did you get your statistics, then? The idea that 70% of women and 50% of men were sexually abused as children sounds absolutely insane to me.

2) We are talking about child sexual abuse here, not whether an individual feels that their parents emotionally abused them. Apple isn't scanning for that anyways (yet). If you got your 70% statistic by including that, your statement was false. I'd also like to see your source for that.
 
So I suspect the debate is going to be whether people like Apple's or Google's version of complying with the CSAM laws in regards to cloud storage.
It's about Apple spying on private devices, for now. Apple scanning their own servers would not be the same problem.
 
They just have to make a good faith effort.
They get an image tagged "this is child porn" and they still store it on their servers. Sounds like they are not making any effort at all. Courts will likely agree that they are not making a good faith effort when they ignore the blatant tagging. What they should be doing is forwarding the image with proper identifying info to the proper authorities in the jurisdiction of the uploader and then deleting it.
 
False. See the FOSTA-SESTA legislation.
From the legal analysis:

“Additionally, FOSTA-SESTA puts further conditions on the applicability of 230, and platforms very much are legally required to remove child pornography.”
Being required to remove child pornography doesn't imply they must actively search for it.

The federal statutes governing child pornography explicitly say providers (such as Apple) don't have to actively search for it. They do have to report it when they become aware of it.
 
Being required to remove child pornography doesn't imply they must actively search for it.

The federal statutes governing child pornography explicitly say providers (such as Apple) don't have to actively search for it. They do have to report it when they become aware of it.
Saying I was unaware of child porn on my personal storage systems doesn't get me off the hook for possession. Same should apply to Apple on their storage. Particularly when they are notified that the image is child porn when it is uploaded to storage they own.
 
If you read the comments in this thread, it's clear that people still don't understand how it works. So yeah, there is confusion.

I'm on page 3 of 11 and there are at least 5 responses from people who could not have read / understood the article... And this has been going on for days. So yeah, "confusion" fits.

I reckon that by the time I get to the end of the comments, someone will mention, again, 'what about my kids' bath pics?'
We must be reading different comments. I'm on page 12 of 25 and haven't read a single comment about kids' bath pics yet. I did read many of those on previous articles on this subject, but not in the comments going with this article.
 
I think the fact that Apple has been transparent about this and told us is very reassuring. They could have chosen not to.

Transparent?

Being transparent would have been discussing this at WWDC and working with independent agencies and security research groups to come up with best practices.

If they want or need to scan for this type of content by law, they need to do it on their servers like everyone else does.

iCloud Photo Library is not E2EE and they've made no commitments to do that.

Scan on the server please.
 
It is according to the law. So it looks like Apple is being forced to go after CSAM content on their iCloud network in order to keep their Section 230 immunity.

Actually, the opposite is true. They are only required to act if they are “knowingly” aware of illegal content. Until they started looking at it, they could not knowingly be aware of any of it, unless reported by someone. By now routinely inspecting all content, they lose their 230 immunity because now they are “knowingly” aware of ALL iCloud Photos content.
 
Wrong again. Encryption is broken at the point the iPhone sends photos to iCloud, and then to Apple employees for investigation.

It's not.

When the CSAM Detection system scans the photo, the photo is in an unencrypted state. There is no encryption to break.

When the system creates the safety voucher, again the photo is in an unencrypted state. The safety voucher, which includes a derivative of the unencrypted photo, is encrypted with a secret key unknown to Apple and known only to the device, not the user. But every safety voucher contains part of the key necessary to unlock this encryption. When Apple has collected enough safety vouchers, they have enough of the key to decrypt all the safety vouchers and then read the pictures within them.

No encryption was broken, since the encryption was employed to allow only Apple to read it. If the user or someone else could read the vouchers, then encryption would be broken.

When iOS sends the regular photo to iCloud Photo Library, it's encrypted with two keys, one held by the user and one held by Apple. This means that both Apple and the user can read the content. Encryption is only broken if someone else is able to read it.

Encryption != Only I can read it
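
For anyone who wants to see the threshold idea above in concrete terms, here is a minimal Swift sketch of Shamir-style threshold secret sharing, the general technique behind "every voucher carries part of the key". This is not Apple's actual construction; the toy prime field, the threshold of 30 and every name below are assumptions chosen only for readability.

```swift
import Foundation

let prime = 2_147_483_647  // 2^31 - 1, a toy prime field modulus

// Modular exponentiation, used for the modular inverse via Fermat's little theorem.
func powMod(_ base: Int, _ exp: Int, _ mod: Int) -> Int {
    var result = 1, b = base % mod, e = exp
    while e > 0 {
        if e & 1 == 1 { result = result * b % mod }
        b = b * b % mod
        e >>= 1
    }
    return result
}

func invMod(_ a: Int, _ mod: Int) -> Int { powMod(a, mod - 2, mod) }

// Split `secret` into `count` shares; any `threshold` of them reconstruct it.
func makeShares(secret: Int, threshold: Int, count: Int) -> [(x: Int, y: Int)] {
    // Random polynomial of degree threshold-1 with the secret as constant term.
    let coeffs = [secret] + (1..<threshold).map { _ in Int.random(in: 1..<prime) }
    return (1...count).map { x -> (x: Int, y: Int) in
        var y = 0, xPow = 1
        for c in coeffs {
            y = (y + c * xPow % prime) % prime
            xPow = xPow * x % prime
        }
        return (x: x, y: y)
    }
}

// Lagrange interpolation at x = 0 recovers the constant term (the secret).
func reconstruct(from shares: [(x: Int, y: Int)]) -> Int {
    var secret = 0
    for (i, si) in shares.enumerated() {
        var num = 1, den = 1
        for (j, sj) in shares.enumerated() where i != j {
            num = num * (prime - sj.x) % prime                    // (0 - x_j)
            den = den * ((si.x - sj.x + prime) % prime) % prime   // (x_i - x_j)
        }
        let term = si.y * num % prime * invMod(den, prime) % prime
        secret = (secret + term) % prime
    }
    return secret
}

let key = 123_456_789  // stand-in for a per-account decryption key
let shares = makeShares(secret: key, threshold: 30, count: 100)
print(reconstruct(from: Array(shares.prefix(30))) == key)  // true: 30 shares suffice
// With only 29 shares, the interpolation almost certainly yields an unrelated value.
```

The only point of the sketch is the mathematical property being debated in this thread: below the threshold the collected vouchers reveal nothing about the key, and at the threshold everything opens at once.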
 
It’s not that difficult. In 2018, when this legislation was being debated, most tech sites, like The Verge, warned this would happen. You can’t blame Apple for seeing how courts are interpreting FOSTA-SESTA and becoming compliant before they are found liable as well.

To side with user privacy, as Apple has claimed to do until now, all they have to do is allow end-to-end encryption of all iCloud content, for which they would not hold the keys.
 
Has anybody stated how big the database of image fingerprints stored on my iPhone is going to be? If child porn is such a huge problem, the database is likely very large and may take up a significant chunk of my available storage.

Even more disturbing is that all iPhone users will now be required to store hashes of child porn on phones they own.
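
On the size question: no figure is given in this thread, but a rough back-of-envelope suggests the database need not be a significant chunk of storage. Both numbers in the Swift sketch below are assumptions for illustration, not published figures.

```swift
// Back-of-envelope estimate of the on-device blinded hash database size.
// Both inputs are assumptions for illustration, not published figures.
let assumedHashCount = 1_000_000      // hypothetical number of known-image hashes
let assumedBytesPerEntry = 32         // hypothetical size of one blinded hash entry
let totalBytes = assumedHashCount * assumedBytesPerEntry
print("≈ \(Double(totalBytes) / 1_048_576) MB")  // ≈ 30.5 MB: megabytes, not gigabytes
```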
 
Transparent?

Being transparent would have been discussing this at WWDC and working with independent agencies and security research groups to come up with best practices.

If they want or need to scan for this type of content by law, they need to do it on their servers like everyone else does.

iCloud Photo Library is not E2EE and they've made no commitments to do that.

Scan on the server please.
Whether they told us at WWDC or now makes no difference to transparency. Nobody is running a public release version of iOS 15 yet. The system isn’t live in the current beta versions of iOS 15. You have an informed choice not to install the system update, or of course to turn off iCloud Photo Library.

Telling us at WWDC wouldn’t have made one dot of difference. Now, had they told us after the release of iOS 15, then I’d agree with you. But they didn’t. We’re still at least a month away from release.

I think different people will have different views in regards to scanning on device or scanning on the server. I don’t necessarily disagree with you about doing scans on the server and the server alone. But I’d only want that to be the case if it was possible to do it in a way that meant Apple or others couldn’t see my entire photo library. As far as I know that technology doesn’t exist.

You either have to scan everything on the server and share all of your data with Apple, or, the alternative (and the one Apple has gone with), have your device detect CSAM using on-device processing and then share that and only that.
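
To make "share that and only that" concrete, here is a hedged Swift sketch of what such an on-device flow could look like. Every type and function name is hypothetical, not Apple's API, and a plain set lookup stands in for the blinded matching step (in the actual protocol the device itself does not learn the match result).

```swift
import Foundation

// Hypothetical stand-ins; none of these names are Apple's real API.
struct SafetyVoucher {
    let encryptedPayload: Data  // derivative of the photo; unreadable below the match threshold
    let keyShare: Data          // one share of the per-account decryption key
}

// Stub: would compute a NeuralHash-style perceptual digest of the photo.
func perceptualHash(of photo: Data) -> Data {
    Data(photo.prefix(16))
}

// Simplification: a plain set lookup stands in for the blinded matching step.
func matches(_ hash: Data, database: Set<Data>) -> Bool {
    database.contains(hash)
}

// Stub: a real implementation would encrypt a photo derivative and embed the
// match information so that only the server, past the threshold, can read it.
func makeVoucher(for photo: Data, matched: Bool) -> SafetyVoucher {
    SafetyVoucher(encryptedPayload: Data(), keyShare: Data())
}

// Only the photo being uploaded and its voucher leave the device; nothing else
// in the library is shared for scanning purposes.
func prepareUpload(photo: Data, blindedDatabase: Set<Data>) -> (photo: Data, voucher: SafetyVoucher) {
    let hash = perceptualHash(of: photo)
    let matched = matches(hash, database: blindedDatabase)
    return (photo, makeVoucher(for: photo, matched: matched))
}
```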
 