I don’t see anyone complaining about Google’s program that’s a bit more invasive
> I don’t see anyone complaining about Google’s program that’s a bit more invasive

To be fair, you sort of expect that out of Google, not Apple.
> They just have to make a good faith effort.

And yet they don't. They will permit up to 29 child porn images to be uploaded and stored on their servers before they take action. If they were following the law, they would prevent any of it from being stored and report the uploader to law enforcement immediately.
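For context on the "up to 29 images" point: the system is designed to surface an account for human review only after the number of matches crosses a threshold, which exists to keep false flags rare. Here is a minimal sketch of that trade-off in Swift; the photo count, the per-image false-match rate, and the 30-match threshold below are illustrative assumptions for the arithmetic, not Apple's published figures.

```swift
import Foundation

// Illustrative sketch only: the chance that an account holding `n` photos is
// falsely flagged, assuming each photo independently false-matches the hash
// database with probability `p` and review happens only after `threshold`
// matches. All concrete numbers here are assumptions, not published figures.
func falseFlagProbability(photos n: Int, falseMatchRate p: Double, threshold: Int) -> Double {
    // Walk the Binomial(n, p) distribution with the recurrence
    //   pmf(k + 1) = pmf(k) * (n - k) / (k + 1) * p / (1 - p)
    // and accumulate the tail P(X >= threshold).
    var pmf = pow(1 - p, Double(n))        // P(X = 0): no false matches at all
    var tail = threshold <= 0 ? pmf : 0.0
    for k in 0..<n {
        pmf *= Double(n - k) / Double(k + 1) * p / (1 - p)   // now P(X = k + 1)
        if k + 1 >= threshold { tail += pmf }
    }
    return tail
}

// Example: 20,000 photos and a 1-in-a-million per-image false-match rate.
print(falseFlagProbability(photos: 20_000, falseMatchRate: 1e-6, threshold: 30)) // effectively zero
print(falseFlagProbability(photos: 20_000, falseMatchRate: 1e-6, threshold: 1))  // ≈ 0.02, i.e. about 2%
```

Under those same assumed numbers, acting on the very first match would falsely flag about 2% of such accounts, which is the usual argument for a threshold rather than immediate action on a single hit.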
> That assumes that it will be limited to images. Eventually the system will not look just at pictures, but also at App activity that cannot be seen from the data in iCloud.

Exactly. If China or Russia wants to do surveillance, they’ll just directly decrypt and look at someone’s entire iCloud data, because they already have access to the servers in their country, and not have to rely on this convoluted system to report suspected matched images.
> That assumes that it will be limited to images. Eventually the system will not look just at pictures, but also at App activity that cannot be seen from the data in iCloud.

Evidenced by?
> To be fair, you sort of expect that out of Google, not Apple.

True. But the complaints ring hollow for only pointing out one company’s compliance with a law versus another.
> True. But the complaints ring hollow for only pointing out one company’s compliance with a law versus another.

So I suspect the debate is going to be whether people like Apple's or Google's version of complying with the CSAM laws in regard to cloud storage.
> 1) Where did you get your statistics then? The idea that 70% of women and 50% of men were sexually abused as children sounds absolutely insane to me.

I wasn’t referring to only America. And even when looking at American statistics, it’s well known that they don’t represent minorities well.
Abuse is subjective. If you ask most millennials, Gen Xers, the silent generation, and baby boomers, they will say yes, they endured some abuse from their parents, whether physical, emotional, sexual, or verbal. So no, we cannot say most people don't abuse children.
And with your last 2 points, I agree; that's why I said this whole thing is very iffy to me. But I also said I can see why people are OK with it.
> So I suspect the debate is going to be whether people like Apple's or Google's version of complying with the CSAM laws in regard to cloud storage.

It's about spying on private devices by Apple, for now. Apple scanning their servers would not be the same problem.
It sounds like a bunch of people are just mad at legislation passed in 2018. Y’all can be mad if you want, but it’s a tad late to be upset about it.
> They just have to make a good faith effort.

They get an image tagged "this is child porn" and they still store it on their servers. Sounds like they are not making any effort at all. Courts will likely agree that they are not making a good faith effort when they ignore the blatant tagging. What they should be doing is forwarding the image with proper identifying info to the proper authorities in the jurisdiction of the uploader and then deleting it.
> Required to remove child pornography doesn't imply must actively search for it.

False. See the FOSTA-SESTA legislation.
From the legal analysis:
“Additionally, FOSTA-SESTA puts further conditions on the applicability of 230, and platforms very much are legally required to remove child pornography.”
Source: What Section 230 Is and Does -- Yet Another Explanation of One of the Internet's Most Important Laws (www.publicknowledge.org)
> Required to remove child pornography doesn't imply must actively search for it.

Saying I was unaware of child porn on my personal storage systems doesn't get me off the hook for possession. The same should apply to Apple on their storage, particularly when they are notified that the image is child porn as it is uploaded to storage they own.
The federal statutes governing child pornography explicitly say providers (such as Apple) don't have to actively search for it. They do have to report it when they become aware of it.
> If you read the comments in this thread, it's clear that people still don't understand how it works. So yeah, there is confusion.

We must be reading different comments. I'm on page 12 of 25 and haven't read a single comment about kids' bath pics yet. I did read many of those on previous articles on this subject, but not in the comments on this article.
I'm on page 3 of 11 and there are at least 5 responses from people who could not have read / understood the article... And this has been going on for days. So yeah, "confusion" fits.
I reckon that by the time I get to the end of the comments, someone will mention, again, 'what about my kids' bath pics?'
I think the fact that Apple has been transparent about this and told us is very reassuring. They could have chosen not to.
> Evidenced by?

Obviously there is no evidence yet of something that hasn't been made. Now that Pandora's box has been opened, nothing will stop state actors from demanding such access.
It is, according to the law. So it looks like Apple is being forced to go after CSAM content on their iCloud network in order to keep their Section 230 immunity.
Wrong again. Encryption is broken at the point the iPhone sends photos to iCloud, and the photos then go to employees for investigation.
It’s not that difficult. In 2018, when this legislation was being debated, most tech sites, The Verge among them, warned this would happen. You can’t blame Apple for seeing how courts are interpreting FOSTA-SESTA and becoming compliant before they are found liable as well.
How would you use the CSAM detection to do this?
Please provide details since you seem so sure it's easy.
Has anybody stated how big the database of image fingerprints stored on my iPhone is going to be? If child porn is such a huge problem, the database is likely very large and may take up a significant chunk of my available storage.
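Nobody in the thread has put numbers on that, so here is a rough back-of-envelope sketch in Swift. The entry count and per-entry size are assumptions picked for illustration, not figures Apple has published.

```swift
// Rough back-of-envelope for the size question above. Both numbers are
// assumptions for illustration, not published figures: suppose the on-device
// database holds a few million blinded hash entries and each entry is on the
// order of 32 bytes.
let assumedEntries = 5_000_000
let assumedBytesPerEntry = 32
let totalBytes = assumedEntries * assumedBytesPerEntry
print("≈ \(Double(totalBytes) / 1_000_000) MB")   // ≈ 160 MB at these assumptions
// At 1,000,000 entries the same math gives ≈ 32 MB, which is noticeable but
// nowhere near a "significant chunk" of a 64 GB or larger device.
```

Whatever the real figures turn out to be, the point of the sketch is that a fingerprint database scales with bytes per entry times entry count, not with the size of the images themselves.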
> Transparent?
> Being transparent would have been discussing this at WWDC and working with independent agencies and security research groups to come up with best practices.
> If they want or need to scan for this type of content by law, they need to do it on their servers like everyone else does.
> iCloud Photo Library is not E2EE and they've made no commitments to do that.
> Scan on the server please.

Whether they told us at WWDC or now makes no difference to transparency. Nobody is running a public release version of iOS 15 yet, and the system isn’t live in the current beta versions of iOS 15. You have an informed choice not to install the system update or, of course, to turn off iCloud Photo Library.
> Crisis… lol. A stumble in messaging isn’t a crisis. Especially for an issue that has broad public support.

Yeah, reading the comments here, you can really feel that “broad public support”.