> If Apple is doing this to keep their immunity, they should have announced that fact, and with these changes Apple needs to make iCloud Photos storage opt-in, not opt-out.
That’s a PR thing.
Texas Supreme Court Says Facebook Can Be Held Liable For Child Sex Trafficking On Its Platform - Biometrica Systems, Inc.
By a Biometrica staffer: In a move that is likely to have a far-reaching ripple effect, on Friday, June 25, the Texas Supreme Court ruled that Facebook can be held responsible for its users engaging in illegal sex trafficking activities on the site, regardless of protections offered to the company in...
www.biometrica.com
2) Apple can decline requests that are not possible to fulfill. It is not possible for Apple to decrypt iCloud backups without the password, for example. Therefore they cannot comply with such a request and must decline, as they have done before.
3) The actual point, and you've made it for me quite nicely, is that now the capability will exist to access data that is outside of the encrypted data stored on iCloud servers. Apple can no longer decline to provide the data on the basis that it is impossible, and they must expand the capabilities of on-phone analysis if compelled by authority. Apple opened the door and created the capability themselves. They can hardly refuse to use it afterwards.
False. See the FOSTA-SESTA legislation.
From the legal analysis:
“Additionally, FOSTA-SESTA puts further conditions on the applicability of 230, and platforms very much are legally required to remove child pornography.”
What Section 230 Is and Does -- Yet Another Explanation of One of the Internet's Most Important Laws
Section 230 of the Communications Decency Act immunizes internet platforms from any liability as a publisher or speaker for third-party...
www.publicknowledge.org
> Texas is a state, not a country.
Nobody said it was a country? It’s the direction courts are moving. You asked for case law, you got it. FOSTA-SESTA is very clear. Many internet tech sites warned of this in 2018 when it was passed.
The title could truthfully and appropriately be changed to:
"Craig Federighi Acknowledges All iCloud users can store up to 29 exact matching CSAM images"
At the 30th image, Apple will take minor steps to notify law enforcement after viewing all the images themselves first.
This doesn't sound legal.
> False. Only if they are KNOWINGLY aware of it. Meaning, someone should report it to them for them to take action. It does NOT mean they need to start looking at all their customers’ private content.
You are wrong again. See the precedent with Facebook. Case law always starts at lower levels and works its way up. A state Supreme Court ruling is hugely impactful, and lawyers around the country are taking notice.
> False. Only if they are KNOWINGLY aware of it. Meaning, someone should report it to them for them to take action. It does NOT mean they need to start looking at all their customers’ private content.
How can it be more clear? They are required to remove child pornography from their platform.
The problem is that if Apple turns a blind eye to it, they can be held legally liable. I do not like the fact that the scanning is done on device, but it is Apple's legal team that is most likely protecting the company's shareholders from a lawsuit that hurts their investment.
> [Apple] is not “turning a blind eye” by respecting the privacy of user content. They are “opening an eye” by now examining it.
It is according to the law. So it looks like Apple is being forced to comply with going after CSAM content on their iCloud network in order to keep their Section 230 immunity.
> The problem is that if Apple turns a blind eye to it, they can be held legally liable. I do not like the fact that the scanning is done on device, but it is Apple's legal team that is most likely protecting the company's shareholders from a lawsuit that hurts their investment.
Not a blind eye, but eyes with binoculars.
> [Apple] is not “turning a blind eye” by respecting the privacy of user content. They are “opening an eye” by now examining it.
It’s not that difficult. In 2018, when this legislation was being debated, most tech sites like The Verge warned this would happen. You can’t blame Apple for seeing how courts are interpreting FOSTA-SESTA and becoming compliant before they are found liable as well.
Firstly, while the number of children who are sexually abused is way too high (and the only acceptable number is zero), the number you quoted is waaay too high: https://www.rainn.org/statistics/children-and-teens
Secondly, most people do not abuse kids. We do not search everyone's home because some people commit crimes; you need a warrant. The same should apply to our online data.
Thirdly, the rise of the internet has made it exponentially easier to share CSAM. Whatever Apple catches with their system will unfortunately only be a drop in the bucket. Has anyone proven that cloud providers scanning everyone's photos for known CSAM actually decreases child abuse and the creation of CSAM, rather than merely catching already known images? Apple has been happy to explain why their NeuralHash system makes everything OK, but they haven't explained whether their system will actually decrease child abuse.
Fourthly, Apple says the threshold is 30 matching photos. That means if Apple has 29 images that match the hashes, making them extraordinarily likely to be CSAM, Apple will just leave them up on their servers? That sounds all kinds of illegal, or Apple's algorithm is truly s**t.
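To make the 30-match threshold the thread keeps coming back to concrete, here is a minimal counting sketch. It is not Apple's implementation: Apple's published design wraps each match in an encrypted safety voucher and uses threshold secret sharing, so matches below the threshold can't be counted or read by anyone. All names below (CSAMThresholdSketch, knownHashes, matchThreshold) are made up for illustration.

```swift
// Illustrative only: a plain-counting version of a 30-match threshold.
// Apple's published design uses NeuralHash plus blinded hashes and
// threshold secret sharing, so nothing below the threshold is readable;
// the names here are invented for this sketch, not Apple API.
struct CSAMThresholdSketch {
    let knownHashes: Set<String>   // perceptual hashes of known images (hex strings here)
    let matchThreshold = 30        // the threshold discussed in the thread

    /// Counts how many of the user's photo hashes appear in the known set
    /// and reports whether the threshold has been crossed.
    func evaluate(photoHashes: [String]) -> (matches: Int, thresholdReached: Bool) {
        let matches = photoHashes.filter { knownHashes.contains($0) }.count
        return (matches, matches >= matchThreshold)
    }
}

// Example: 29 matches stays below the threshold; the 30th crosses it.
let sketch = CSAMThresholdSketch(knownHashes: Set((0..<100).map { "hash-\($0)" }))
print(sketch.evaluate(photoHashes: (0..<29).map { "hash-\($0)" }))  // (matches: 29, thresholdReached: false)
print(sketch.evaluate(photoHashes: (0..<30).map { "hash-\($0)" }))  // (matches: 30, thresholdReached: true)
```

The extra machinery in the real design is Apple's stated answer to the "29 images left on the server" objection: unlike this sketch, below the threshold nobody, including Apple, is supposed to be able to see that any match occurred at all.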
> Has anybody stated how big the database of image fingerprints stored on my iPhone is going to be? If child porn is such a huge problem, the database is likely very large and may take up a significant chunk of my available storage.
I would assume the file would be compressed to save space.
> In addition, it does not help that Apple's PR team totally botched this announcement and did not bring up that they are required to police their iCloud network for this type of content.
Yes, they should have. But perhaps they didn’t want the public to think they will only do the right thing when legally required to.
> Has anybody stated how big the database of image fingerprints stored on my iPhone is going to be? If child porn is such a huge problem, the database is likely very large and may take up a significant chunk of my available storage.
The fingerprints aren’t stored on device. The hash comparison is done online.
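On the size question quoted above, a rough back-of-the-envelope estimate is possible, though both inputs below are assumptions rather than published Apple figures: public analyses of NeuralHash have reported a 96-bit hash, and the number of known entries is simply guessed at one million here.

```swift
import Foundation

// Rough, assumption-laden estimate of how big an on-device fingerprint
// table could be. Both numbers are assumptions, not published Apple figures:
// a 96-bit (12-byte) perceptual hash and one million known entries.
let bytesPerHash = 12
let knownEntries = 1_000_000
let rawBytes = bytesPerHash * knownEntries
let megabytes = Double(rawBytes) / 1_048_576.0
print(String(format: "~%.1f MB before compression or blinding overhead", megabytes))
// ~11.4 MB; even at several million entries this stays in the tens of megabytes.
```

Under these assumptions the table would be tens of megabytes at most, i.e. not a significant chunk of typical iPhone storage, whichever way the "on device or online" question turns out.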
> States vary on age of consent widely, as do countries. Children are also a different category than adolescents, by definition. I don’t see any of this reflected in this kind of fear-based legislation, and I doubt the US Supreme Court would uphold any of these “precedents.” While real child sexual abuse is very concerning, lecturing 16-year-olds that there’s something potentially wrong with seeing someone naked in iMessage goes way too far.
You are conflating two different programs. The 16-year-old in your scenario will only get lectured if their parents enable that feature. Parents always have the right to monitor a child’s content; also, federal law has a very clear age of majority.
> How can it be more clear? They are required to remove child pornography from their platform.
And yet they don't. They will permit up to 29 child porn images to be uploaded and stored on their servers before they take action. If they were following the law, they would prevent any of it being stored and report the uploader immediately to law enforcement.