But what that doesn't include is "send that content and your name to the police."

Having said that, CSAM tends to be subject to mandatory reporting laws that would override the terms of service anyway. But if this were to expand to other material besides CSAM, the question of whether Apple has the right to report you to authorities starts to get more complicated.
Neither does their current CSAM proposal - read and inform yourself. They are passing it to other entities that may pass it to the police.

They have also addressed the slippery slope concern head on.
 
The only scanning Apple is doing is in response to a subpoena.
Out of 21.4 million CSAM reports filed, Apple accounted for only 265, by far the lowest number of any provider. While Apple won’t confirm whether they are doing iCloud scanning, the numbers don’t support this activity.
"We may also use your personal information for account and network security purposes, including in order to protect our services for the benefit of all our users, and pre-screening or scanning uploaded content for potentially illegal content, including child sexual exploitation material."

That is part of the user agreement from 2019. Apple doesn't need a subpoena in order to scan user data. They have already been doing that. That is how they have been able to file CSAM reports already. In order to get law enforcement involved with a specific user's data, a subpoena is required.
 
Doesn’t mean they do it. If Apple is scanning iCloud for CSAM, why are they only reporting 265 instances last year?
So far, Apple won’t come out and say specifically what they are and are not doing. However, the numbers don’t support a scanning claim.

For the year 2020 - CSAM reports:
Total: 21.4 million
Facebook: 20,307,216
Google: 546,704
Snapchat: 144,095
Microsoft: 96,776
Twitter: 65,062
Imgur: 31,571
TikTok: 22,692
Dropbox: 20,928
Apple: 265

Something about the low numbers reported by Apple doesn’t sound right either. Is Facebook simply that much more popular than iCloud for storing child pornography, are the bulk of offenders using Android phones, or is Apple getting away with doing just the bare minimum to get law enforcement off their backs?
 
Something about the low numbers reported by Apple doesn’t sound right either. Is Facebook simply that much more popular than iCloud for storing child pornography, are the bulk of offenders using Android phones, or is Apple getting away with doing just the bare minimum to get law enforcement off their backs?
CSAM includes self-generated CSAM, and folks use Facebook’s platforms for communication: Facebook itself, WhatsApp, and Instagram.
 
Something about the low numbers reported by Apple doesn’t sound right either. Is Facebook simply that much more popular than iCloud for storing child pornography, are the bulk of offenders using Android phones, or is Apple getting away with doing just the bare minimum to get law enforcement off their backs?

Because Apple has not been scanning iCloud photos for CSAM like the others:

TC: Most other cloud providers have been scanning for CSAM for some time now. Apple has not. Obviously there are no current regulations that say that you must seek it out on your servers, but there is some roiling regulation in the EU and other countries. Is that the impetus for this? Basically, why now?

Erik Neuenschwander: Why now comes down to the fact that we’ve now got the technology that can balance strong child safety and user privacy. This is an area we’ve been looking at for some time, including current state of the art techniques which mostly involves scanning through entire contents of users’ libraries on cloud services that — as you point out — isn’t something that we’ve ever done; to look through users’ iCloud Photos. This system doesn’t change that either, it neither looks through data on the device, nor does it look through all photos in iCloud Photos. Instead what it does is gives us a new ability to identify accounts which are starting collections of known CSAM.

CSAM includes self-generated CSAM, and folks use Facebook’s platforms for communication: Facebook itself, WhatsApp, and Instagram.

Though in Apple’s process, hashes generated on a user’s device are compared against a database of known CSAM hashes. New CSAM with a hash that does not match an existing hash is not flagged.
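
To make that distinction concrete, here is a minimal, hypothetical sketch in Swift of the matching idea only: compare device-generated hashes against a set of known hashes and apply a review threshold. The hash values, names, and threshold below are made up, and the sketch deliberately ignores the real system's NeuralHash, safety vouchers, and private set intersection protocol.

```swift
// Toy illustration of hash matching with a threshold. Everything here is
// hypothetical; it is not Apple's NeuralHash or matching protocol.
typealias PhotoHash = String

// Placeholder database of hashes of known CSAM.
let knownHashes: Set<PhotoHash> = ["a1b2c3", "d4e5f6", "0a0b0c"]

// Number of matches before an account would be surfaced for human review,
// mirroring the "collections of known CSAM" wording in the interview above.
let reviewThreshold = 3

// Count uploaded photos whose hash appears in the known-hash set. A new,
// never-before-seen image hashes to a value not in the set, so it never counts.
func matchCount(of uploadedHashes: [PhotoHash], against known: Set<PhotoHash>) -> Int {
    uploadedHashes.filter { known.contains($0) }.count
}

let uploads: [PhotoHash] = ["zz99zz", "a1b2c3", "ffffff"]
let matches = matchCount(of: uploads, against: knownHashes)
print(matches >= reviewThreshold
    ? "Threshold exceeded: flag account for review"
    : "Below threshold: nothing is flagged or reported")
```

The point is simply that an image whose hash is not already in the known database can never trip the check, no matter what it depicts.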
 
“U.S.-based Electronic Service Providers report instances of apparent child pornography that they become aware of on their systems to NCMEC’s CyberTipline. “

So they are reporting what they find. Given the services that the above companies provide, I don’t find it all that shocking that Apple’s numbers are low.

If they were actually scanning iCloud photos and Mail, I would be amazed at numbers this low.
 
Did you even read that law before you posted it?

The only thing it says is that you are required to report it to the NCMEC. It does not say anything about prohibiting you from reporting it to other people.
Yes, I read the law and cited the applicable section. Instead of attempting to insult me, do your own research. Any provider that finds CSAM has to report it to NCMEC, and NCMEC contacts the proper agency if the report is found to be valid. The law is clear: Apple cannot contact the authorities first in lieu of contacting NCMEC. Apple states this same procedure as well.
 
Yes, I read the law and cited the applicable section. Instead of attempting to insult me, do your own research. Any provider that finds CSAM has to report it to NCMEC, and NCMEC contacts the proper agency if the report is found to be valid. The law is clear: Apple cannot contact the authorities first in lieu of contacting NCMEC. Apple states this same procedure as well.
I did do my own research, and you completely misunderstood the law. Of course Apple can contact the authorities. The fact that I point out you are wrong does not mean I am being rude.

Also, I'd like to point out that you changed your story by adding "in lieu of contacting NCMEC." Yes, Apple does have to contact the NCMEC, but they are also free to contact anyone else they want.
 
I did do my own research, and you completely misunderstood the law. Of course Apple can contact the authorities. The fact that I point out you are wrong does not mean I am being rude.

Also, I'd like to point out that you changed your story by adding "in lieu of contacting NCMEC." Yes, Apple does have to contact the NCMEC, but they are also free to contact anyone else they want.
In regards to CSAM, Apple must contact NCMEC first.

I didn't change my story about anything. I simply made sure I was clear and concise with my previous post.
 
First you said, "By law, Apple can only report possible CSAM to NCMEC." Then you said, "In regards to CSAM, Apple must contact NCMEC first." Those are completely different.

Please post the exact portion of the law that you claim says "reporters are not permitted to contact other officials" (or is it "reporters are not permitted to contact other officials first").
 
Because Apple has not been scanning iCloud photos for CSAM like the others:

Though in Apple’s process, hashes generated on a user’s device are compared against a database of known CSAM hashes. New CSAM with a hash that does not match an existing hash is not flagged.
Correct, but self-generated CSAM was in relation to the numbers presented for 2020, not in relation to Apple’s proposal.
 
First you said, "By law, Apple can only report possible CSAM to NCMEC." Then you said, "In regards to CSAM, Apple must contact NCMEC first." Those are completely different.

Please post the exact portion of the law that you claim says "reporters are not permitted to contact other officials" (or is it "reporters are not permitted to contact other officials first").
We can have a pissing match on this all day; the fact of the matter is Apple isn’t reporting to law enforcement, they are reporting it to NCMEC, which works with law enforcement.
 
"We may also use your personal information for account and network security purposes, including in order to protect our services for the benefit of all our users, and pre-screening or scanning uploaded content for potentially illegal content, including child sexual exploitation material."

That is part of the user agreement from 2019. Apple doesn't need a subpoena in order to scan user data. They have already been doing that. That is how they have been able to file CSAM reports already. In order to get law enforcement involved with a specific user's data, a subpoena is required.

I agree! I was just replying to the “fact” many are stating that Apple scans iCloud like other providers. They don’t.
 
We can have a pissing match on this all day; the fact of the matter is Apple isn’t reporting to law enforcement, they are reporting it to NCMEC, which works with law enforcement.

Unless it is directly a result of a subpoena, warrant, or similar document.
Reading some of these posts, I think many are confusing what Apple finds on their own with what law enforcement tells them to go and look at directly.
 
I wonder if Apple will implement this on Macs. I'm sure there are people with Macs but no iPhones who use iCloud to back up their photos.

I'm still interested in technical details, such as how much this will eat into my iPhone battery, data, etc. Will it only happen when plugged in and on WiFi? Will it happen all the time, or once and then every ?? days?
 
I wonder if Apple will implement this on Macs. I'm sure there are people with Macs but no iPhones who use iCloud to back up their photos.

I'm still interested in technical details, such as how much this will eat into my iPhone battery, data, etc. Will it only happen when plugged in and on WiFi? Will it happen all the time, or once and then every ?? days?
Apple has already stated it is coming to Macs with Monterey. My understanding is that it will happen any time an upload to iCloud occurs.
 
I wonder if Apple will implement this on Macs. I'm sure there are people with Macs but no iPhones who use iCloud to back up their photos.

I'm still interested in technical details, such as how much this will eat into my iPhone battery, data, etc. Will it only happen when plugged in and on WiFi? Will it happen all the time, or once and then every ?? days?

I’m curious how they will update the on-device hash database.
 
At the end of the day, while I applaud Apple’s solution, the fact that it is on-device pushes me a little bit further toward continuing to shrink the amount of Apple hardware I own and use, especially as there was no real need for Apple to go to this extent. After all the digging I have done, Apple’s failure to state why this had to be the solution is a black mark.
 
At the end of the day, while I applaud Apple’s solution, the fact that it is on-device pushes me a little bit further toward continuing to shrink the amount of Apple hardware I own and use, especially as there was no real need for Apple to go to this extent. After all the digging I have done, Apple’s failure to state why this had to be the solution is a black mark.
The question is "why not"?

I think people have been so conditioned to believe that you are no longer entitled to privacy the moment you upload a file online that they cannot envision any other way of doing things.
 
The question is "why not"?

I think people have been so conditioned to believe that you are no longer entitled to privacy the moment you upload a file online that they cannot envision any other way of doing things.

Even that would be a benefit if Apple communicated the “why not”. Silence in this case, especially after the “borked” initial communication, isn’t helpful.
 