
Texas is a state, not a country.
 
2) Apple can decline requests that are not possible. It is not possible for Apple to decrypt iCloud backups without the password, for example. Therefore they cannot comply with such a request and must decline, as they have done before.

3) The actual point, and you've made it for me quite nicely, is that now the capability will exist to access data that is outside of the encrypted data stored on iCloud servers. Apple can no longer decline to provide the data on the basis that it is impossible, and they must expand the capabilities of on-phone analysis if compelled by authority. Apple opened the door and created the capability themselves. They can hardly refuse to use it afterwards.

Apple can decrypt everything in iCloud Photo Library and the backups since they hold a key.
 
False. See FOSTA-SESTA legislation
From the legal analysis:

“Additionally, FOSTA-SESTA puts further conditions on the applicability of 230, and platforms very much are legally required to remove child pornography.”

False. Only if they are KNOWINGLY aware of it. Meaning, someone should report it to them for them to take action. It does NOT mean they need to start looking at all their customers’ private content.
 
Texas is a state, not a country.
Nobody said it was a country? It’s the direction courts are moving. You asked for case law, you got it. FOSTA-SESTA is very clear. Many internet tech sites warned of this in 2018 when it was passed.
 
The title could truthfully and appropriately be changed to:

"Craig Federighi Acknowledges All iCloud users can store up to 29 exact matching CSAM images"​


At the 30th image, Apple will take minor steps to notify law enforcement after viewing all the images themselves first.

This doesn't sound legal.

They aren't notifying law enforcement agencies but NCMEC, a non-profit organization, which they are required to do by federal law the moment they know for sure there is child pornography.

There is no obligation to actively search for child pornography as a service provider. It's explicitly stated in the federal law.

I don't think we will see any legal ramifications for Apple.
 
False. Only if they are KNOWINGLY aware of it. Meaning, someone should report it to them for them to take action. It does NOT mean they need to start looking at all their customers’ private content.
You are wrong again. See the precedent with Facebook. Case law always starts at lower levels and works its way up. A state Supreme Court ruling is hugely impactful and lawyers around the country are taking notice.
 
Craig can put all the lipstick he wants onto this pig. I still ain't gonna kiss it.

Again, Apple could have attached vouchers to every single uploaded photo of their users for scanning on their cloud. If the scanner matched any CSAM it would escalate and notify. If nothing was matched, delete the voucher and post it.

Why the heck did Apple need to put the CSAM database and scanner on the local device? It is the height of hubris to think that governments and hackers won't target it for more nefarious purposes. No other company has done it this way, so how is this method supposed to be 'better'?

For everyone claiming it's already been done and the 'screeching' concern is unjustified: why are there so many concerns from privacy groups, security experts, the EFF and Apple employees as well?

Heck, why are so many people in this forum telling the concerned to sit back and just accept it? If this matter is so unconcerning to you, why bother replying in this forum at all? Surely it's not important enough to even participate...
 
False. Only if they are KNOWINGLY aware of it. Meaning, someone should report it to them for them to take action. It does NOT mean they need to start looking at all their customers’ private content.
How can it be more clear? They are required to remove child pornography from their platform.
 
The problem is that if Apple turns a blind eye to it, they can be held legally liable. I do not like the fact that the scanning is done on device, but it is Apple's legal team that is most likely protecting the company's shareholders from a lawsuit that would hurt their investment.

Apple is not “turning a blind eye” by respecting the privacy of user content. They are “opening an eye” by now examining it.
 
Apple is not “turning a blind eye” by respecting the privacy of user content. They are “opening an eye” by now examining it.
It is, according to the law. So it looks like Apple is being forced to comply with going after CSAM content on their iCloud network in order to keep their Section 230 immunity.
 
Apple is not “turning a blind eye” by respecting the privacy of user content. They are “opening an eye” by now examining it.
It’s not that difficult. In 2018, when this legislation was being debated, most tech sites like The Verge warned this would happen. You can’t blame Apple for seeing how courts are interpreting FOSTA-SESTA and becoming compliant before they are found liable as well
 
In addition it does not help that Apple's PR team totally botched this announcement and did not bring up that they are required to police their iCloud network for this type of content.
 
Has anybody stated how big the database of image fingerprints stored on my iPhone is going to be? If child porn is such a huge problem, the database is likely very large and may take up a significant chunk of my available storage.
 
While the number of children who are sexually abused is way too high (and the only acceptable number is zero), the number you quoted is waaay too high: https://www.rainn.org/statistics/children-and-teens

Secondly, most people do not abuse kids. We do not search everyone's home because some people commit crimes, you need a warrant. The same should apply to our online data.

Thirdly, the rise of the internet has made it exponentially easier to share CSAM. Whatever Apple catches with their system will unfortunately only be a drop in the bucket. Has anyone proven that cloud providers scanning everyone's photos for known CSAM actually decreases child abuse and creation of CSAM, rather than merely catching already known images? Apple has been happy to explain why their NeuralHash system makes everything OK, but they haven't explained if their system will actually decrease child abuse.

Fourthly, Apple says the threshold is that 30 photos need to match. That means if Apple has 29 images that match the hash, making them extraordinarily likely to be CSAM, Apple will just leave them up on their servers? That sounds all kinds of illegal, or Apple's algorithm is truly s**t.

I wasn’t referring to only America. Also, even when looking at American statistics, it’s well known that they don’t represent minorities well.

Abuse is subjective. If you ask most millennials, Gen X, the silent generation and baby boomers, they will say yes, they endured some abuse from their parents, whether physical, emotional, sexual or verbal. So no, we cannot say most people don’t abuse children.

And I agree with your last two points; that’s why I said this whole thing is very iffy to me. But I also said I can see why people are OK with it.
 
Has anybody stated how big the database of image fingerprints stored on my iPhone is going to be? If child porn is such a huge problem, the database is likely very large and may take up a significant chunk of my available storage.
I would assume the file would be compressed to save space.
 
In addition it does not help that Apple's PR team totally botched this announcement and did not bring up that they are required to police their iCloud network for this type of content.
Yes, they should have. But perhaps they didn’t want the public to think they will only do the right thing when legally required to
 
Nobody said it was a country? It’s the direction courts are moving. You asked for case law, you got it. FOSTA-SESTA is very clear. Many internet tech sites warned of this in 2018 when it was passed.

States vary on age of consent widely, as do countries. Children are also a different category than adolescents, by definition. I don’t see any of this reflected in this kind of fear-based legislation and I doubt the US Supreme Court would uphold any of these “precedents.” While real child sexual abuse is very concerning, lecturing 16-year-olds that there’s something potentially wrong with seeing someone naked in iMessage goes way too far, as does allowing a company to look through the private content on your phone for crimes on behalf of the government.
 
Here’s a better explanation of this whole process (but first some context):

Your iPhone has a neural engine inside it. The neural engine already scans your photos on device to index your images. This makes them searchable. That’s why when you search for the term ‘dogs‘ or ‘trees’ for instance, the phone is able to bring up the shots you requested. It’s also how the memories feature works. Heck it’s how the faces feature from iPhoto back in the day worked (but without the benefit of a dedicated neural engine). All of this ‘scanning’ that people are worried about is happening locally on device. No data is shared with Apple for any of the features described above.

So how does the neural engine do this?

The neural engine is trained by Apple with machine learning algorithms. Taking the example of a Dog, the neural engine knows what a Dog looks like because it was trained with a specific algorithm that taught it what Dogs look like. When you take future photos, if the neural engine finds that the photo matches its ‘Dog algorithm’, it gets indexed and sorted as such. The next time you search your library for the term ‘Dogs’, the photo you took is surfaced.
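
To picture what that kind of on-device labelling looks like in code, here is a minimal Swift sketch using the public Vision framework as a stand-in. Apple's actual Photos indexing pipeline is private, so the function name, the 0.8 confidence cutoff and the use of VNClassifyImageRequest here are illustrative assumptions, not how Photos is really implemented:

import Vision

// Minimal sketch: classify an image entirely on device so it can be
// indexed under labels like "dog" and surfaced later by search.
func labels(forImageAt url: URL) throws -> [String] {
    let request = VNClassifyImageRequest()                 // built-in on-device classifier
    let handler = VNImageRequestHandler(url: url, options: [:])
    try handler.perform([request])                         // runs locally; nothing is uploaded
    let observations = request.results as? [VNClassificationObservation] ?? []
    return observations
        .filter { $0.confidence > 0.8 }                    // keep only confident labels (illustrative cutoff)
        .map { $0.identifier }                             // e.g. "dog", "tree", "beach"
}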

And how does this relate to CSAM?

In the simplest sense, Apple has created an on device machine learning algorithm that looks for CSAM, but much narrower in scope. Narrower than the aforementioned ‘Dog algorithm’. The algorithm has been programmed with neural hashes that represent known CSAM images from a database. This means CSAM not in the database won’t be detected, as the algorithm doesn’t recognise a match. At this stage in the process, nothing changes for you as the user and your privacy has not been affected one bit. If you’ve been happily using photo memories, or searching your library for shots you took on that nice family trip to Disney World, then understand this: that was achieved in the same way that Apple detects CSAM on device.

User saves photo > photo is run through neural engine on device > photo gets categorised and indexed as ‘Dog’ > user is able to search their library at will for photos of Dogs and find that photo they just saved
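
The known-image matching described above works differently from that classifier: it is a lookup against a fixed set of hashes rather than a model that "understands" content. A conceptual Swift sketch of that idea, where neuralHash(of:) is a hypothetical stand-in for Apple's private NeuralHash and the plain Set stands in for the blinded database (in Apple's real protocol the device itself never learns whether a match occurred):

import Foundation

// Conceptual sketch only: detection is a membership test against hashes
// derived from a database of already-known images, so a photo that isn't
// in that database produces no match at all.
struct KnownImageMatcher {
    let knownHashes: Set<Data>                   // stands in for the blinded on-device database

    func matchesKnownImage(_ imageData: Data) -> Bool {
        let hash = neuralHash(of: imageData)     // hypothetical perceptual hash
        return knownHashes.contains(hash)        // exact lookup, not content analysis
    }
}

// Placeholder so the sketch is self-contained; a real perceptual hash
// tolerates resizing and recompression, which this stand-in does not.
func neuralHash(of imageData: Data) -> Data {
    Data(imageData.prefix(16))
}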

But what about my iCloud Photo Library?

Here is where the CSAM feature differs. If and only if the machine learning algorithm on your device has detected known CSAM photos, then those specific photos that were flagged get scanned a second time by another algorithm in iCloud as they get uploaded. Those photos and those photos alone become viewable by Apple. A human reviewer then checks to make sure the algorithm didn’t make a mistake. If it did make a mistake then nothing happens. But if confirmed CSAM content is matched, your account is reported.
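
As described above, nothing is reviewed or reported below the threshold, and a human checks the flagged photos before any report goes out. A rough Swift sketch of that server-side decision, with the type names and the 30-image constant used as assumptions for illustration rather than Apple's actual implementation:

import Foundation

// Illustrative flow only, following the description in this post.
struct SafetyVoucher {
    let matchedKnownHash: Bool
    let flaggedImage: Data        // in the real system this only becomes readable past the threshold
}

enum ReviewOutcome {
    case noAction                 // below threshold, or the human reviewer found a false positive
    case reportAccount            // confirmed matches are reported (to NCMEC, per the thread)
}

func evaluateAccount(vouchers: [SafetyVoucher],
                     humanConfirmsCSAM: ([Data]) -> Bool,
                     threshold: Int = 30) -> ReviewOutcome {
    let matches = vouchers.filter { $0.matchedKnownHash }
    guard matches.count >= threshold else { return .noAction }   // 29 matches: nothing is decrypted or reviewed
    let flagged = matches.map { $0.flaggedImage }
    return humanConfirmsCSAM(flagged) ? .reportAccount : .noAction
}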


Just to summarise

1. Do you use photo memories or faces or ever search your library by location or using key words? All of that happens on device and Apple never sees your photos. And the scanning for CSAM works in exactly the same way but with a much narrower scope. If you don’t like the idea of the system scanning for CSAM then I hate to break it to you, but your phone has been doing on device machine learning and scanning of your photos for many years.

2. Apple isn’t scanning your entire iCloud Photo Library. They’re running a second scan on any images that were first detected and flagged using on device machine learning. This happens as the image gets uploaded.

Machine Learning has been happening on our devices for years for many of the features we use in the photos app. The benefit of keeping it on device is it’s private. No data is processed in the cloud.

The only real thing that has changed with this new system, is that if known CSAM photos were detected using on device machine learning (around 30 images), they would then be scanned on upload to iCloud and get reviewed by a human.

Nothing about the way your iPhone works has changed. Nothing about how your photos are stored or processed has changed. The only time any processing or scanning of images happens in iCloud is if your iPhone detected CSAM images on device using the same old neural engine that’s been running in the same old way that it has since the iPhone X.

My personal view is this. The technology being used is the most thoughtful and privacy preserving implementation used by any company so far. My only concern is if governments try to force Apple to widen the scope of their on device machine learning algorithm. But I also appreciate that, as Craig pointed out, the security community would be able to determine quickly enough if Apple lied to us. And Apple has multiple audit levels to prevent interference through government or other means.

I think the fact that Apple has been transparent about this and told us is very reassuring. They could have chosen not to. Which would have been morally wrong. But if it was their intent to start abusing our trust by covertly scanning for other things in the cloud, then telling us upfront about this feature would be a pretty dumb way to start.

Based on everything Apple has done for privacy so far I’m willing to continue to trust them. They didn’t do a good job of explaining this technology but I do believe they’re trying to do what is right, not what is easy. If in the future they go back on their word regarding widening the scope of the system to detect other things, I’ll be the first to call them out on it and call for them to be sued into the ground.

Until then I remain confident in Apple’s commitment to privacy.
 
States vary on age of consent widely, as do countries. Children are also a different category than adolescents, by definition. I don’t see any of this reflected in this kind of fear-based legislation and I doubt the US Supreme Court would uphold any of these “precedents.” While real child sexual abuse is very concerning, lecturing 16-year-olds that there’s something potentially wrong with seeing someone naked in iMessage goes way too far.
You are conflating two different programs. The 16-year-old in your scenario will only get lectured if their parents enable that feature. Parents always have the right to monitor a child’s content; also, federal law has a very clear age of majority.
 
How can it be more clear? They are required to remove child pornography from their platform.
And yet they don't. They will permit up to 29 child porn images to be uploaded and stored on their servers before they take action. If they were following the law they would prevent any of it being stored and report the uploader immediately to law enforcement.
 