Apple has started notifying users about an upcoming revision to its iCloud Terms and Conditions, which takes effect on Monday, September 16.


We compared the text of the upcoming iCloud Terms and Conditions with the current U.S. version, dated September 18, 2023, and identified four key changes:

  • "Apple ID" references have been changed to "Apple Account" throughout.
  • iCloud users must agree to not "engage in any activity that exploits, harms or threatens children in any way, including without limitation producing, sharing, uploading or storing Child Sexual Abuse Material (CSAM)."
  • A clause was added about statutory rights under Australian consumer law.
  • "Effective October 26, 2024, for users in Bhutan, Brunei, Cambodia, Fiji, Laos, Macau, Maldives, Micronesia, Mongolia, Myanmar, Nauru, Nepal, Palau, Papua New Guinea, South Korea, Sri Lanka, Solomon Islands, Tonga, and Vanuatu, 'Apple' means Apple Services Pte. Ltd., located in Singapore. Prior to October 26, 2024, 'Apple' means Apple Distribution International Ltd. for such users."

The list of changes above is not comprehensive, and this article is not intended to provide any legal advice. Please read Apple's revised iCloud Terms and Conditions in full yourself before deciding if you will agree to them.

September 16 is the same day that Apple will be releasing iOS 18, macOS Sequoia, watchOS 11, and other software updates.

Article Link: Here's What's New in Apple's Updated iCloud Terms and Conditions Taking Effect Next Week
 
iCloud users must agree to not "engage in any activity that exploits, harms or threatens children in any way, including without limitation producing, sharing, uploading or storing Child Sexual Abuse Material (CSAM)."
Does that also include giving children and young teens an iPhone, so that they can get on all those social media apps that are bad for them and their mental health?
 
There is a clause about allowing Apple to look at individual images in your library, and MD5 hashes are viewable by Apple as well, even with Advanced Data Protection. Nothing about automation, but there's no reason they would ever need to mention it, so all in all it sounds like CSAM scanning is probably in use.

The type of scanning that involves identifying images with AI probably isn't possible with ADP, though.

/edit: ADP wouldn't help if the scanning happens locally on-device. No indication that this is happening, though.
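
For anyone curious what hash matching actually looks like, here's a minimal Swift sketch of the concept (the hash list and file path are made up; this is my own illustration, not anything Apple has published). Worth noting that an exact hash like MD5 only matches byte-identical files; Apple's shelved 2021 CSAM proposal used perceptual hashing (NeuralHash) instead, which matches visually similar images.

Code:
import CryptoKit
import Foundation

// Hypothetical blocklist of known hashes (this one is just the MD5 of "hello").
let knownHashes: Set<String> = ["5d41402abc4b2a76b9719d911017c592"]

// Compute a file's MD5 as a lowercase hex string.
func md5Hex(of fileURL: URL) throws -> String {
    let data = try Data(contentsOf: fileURL)
    return Insecure.MD5.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Exact-match "scanning" is then just a set lookup. Path is hypothetical.
let url = URL(fileURLWithPath: "/tmp/photo.jpg")
if let hash = try? md5Hex(of: url) {
    print(knownHashes.contains(hash) ? "matches a known hash" : "no match")
}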

Apple reserves the right to take steps Apple believes are reasonably necessary or appropriate to enforce and/or verify compliance with any part of this Agreement. You acknowledge and agree that Apple may, without liability to you, access, use, preserve and/or disclose your Account information and any Content to law enforcement authorities, government officials, and/or a third party, as Apple believes is reasonably necessary or appropriate, if legally required to do so or if Apple has a good faith belief that such access, use, disclosure, or preservation is reasonably necessary to: (a) comply with legal process or request; (b) enforce this Agreement, including investigation of any potential violation thereof; (c) detect, prevent or otherwise address security, fraud or technical issues; or (d) protect the rights, property or safety of Apple, its users, a third party, or the public as required or permitted by law. You acknowledge that Apple is not responsible or liable in any way for any Content provided by others and has no duty to screen such Content. However, consistent with Apple's privacy policy, Apple reserves the right at all times to determine whether Content is appropriate and in compliance with this Agreement, and may prescreen, move, refuse, modify and/or remove Content at any time, without prior notice and in its sole discretion, if such Content is found to be in violation of this Agreement or is otherwise objectionable.
 
I wonder if this is one feature where AI will become a useful tool: asking AI to summarize the terms or to pick out specific points regarding privacy or anything else one might be concerned about.

Of course that is all based on whether you can trust AI.
 
Aaaand I'm spending the rest of the day removing everything I have from iCloud. Not because I am doing anything wrong but because any moderation system is prone to abuse and false positives and I literally cannot afford to get all my **** shut down one day.

I actually want to exist entirely offline at this point. Back like 2010 again. Files on disk. Physical backups on disks off-site.
 
Aaaand I'm spending the rest of the day removing everything I have from iCloud. Not because I am doing anything wrong but because any moderation system is prone to abuse and false positives and I literally cannot afford to get all my **** shut down one day.

I actually want to exist entirely offline at this point. Back like 2010 again. Files on disk. Physical backups on disks off-site.
If you want to exist entirely offline, why did you post this online?
 
I wonder if this is one feature where AI will become a useful tool: asking AI to summarize the terms or to pick out specific points regarding privacy or anything else one might be concerned about.

Of course that is all based on whether you can trust AI.
This is the thing with AI and the whole "on-device" analysis that has been pushed in the name of "privacy". In my opinion this kind of tech is much scarier than any on-server analysis, because it makes any E2EE solution useless (since everything is analyzed before it's encrypted, or after it has been decrypted, as with messages). It involves way too much trust in Apple or any big tech. Everyone freaks out about Recall (for good reasons), but nobody mentions the risks of Apple Intelligence's on-device analysis, especially the new context features where Siri can act on whatever is on screen. I'm not saying Apple Intelligence is malicious as it is, but it opens the door to endless new possibilities for surveillance.
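
To make the analyze-before-encrypt point concrete, here's a toy Swift sketch (all names are hypothetical; this is not Apple's actual pipeline). E2EE protects the wire and the server, but any on-device hook runs on the plaintext before the encryption ever happens:

Code:
import CryptoKit
import Foundation

// Hypothetical on-device analysis hook: it sees the plaintext,
// no matter how strong the encryption applied afterwards is.
func analyzeOnDevice(_ plaintext: Data) -> [String] {
    // A real classifier would go here; the point is that it runs pre-encryption.
    return plaintext.isEmpty ? [] : ["classified-before-encryption"]
}

// E2EE send path: the analysis result exists outside the encrypted envelope.
func sendEndToEndEncrypted(_ message: Data, using key: SymmetricKey) throws -> Data {
    let findings = analyzeOnDevice(message) // could be reported out-of-band
    print("local analysis:", findings)
    return try AES.GCM.seal(message, using: key).combined!
}

let key = SymmetricKey(size: .bits256)
let ciphertext = try! sendEndToEndEncrypted(Data("hello".utf8), using: key)
print("ciphertext bytes:", ciphertext.count)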
 
There is a clause about allowing Apple to look at individual images in your library, and MD5 hashes are viewable by Apple as well, even with Advanced Data Protection. Nothing about automation, but there's no reason they would ever need to mention it, so all in all it sounds like CSAM scanning is probably in use.

The type of scanning that involves identifying images with AI probably isn't possible with ADP, though.
This particular clause you quoted isn't anything new. It has been in there since the beginning of iCloud in 2011/2012...so even way before the whole CSAM debacle a couple years ago.

Here's what the Terms and Conditions looked like in 2012, where you'll find the quoted clause word for word.
 
This is the thing with AI and the whole "on-device" analysis that has been pushed in the name of "privacy". In my opinion this kind of tech is much scarier than any on-server analysis, because it makes any E2EE solution useless (since everything is analyzed before it's encrypted, or after it has been decrypted, as with messages). It involves way too much trust in Apple or any big tech. Everyone freaks out about Recall (for good reasons), but nobody mentions the risks of Apple Intelligence's on-device analysis, especially the new context features where Siri can act on whatever is on screen. I'm not saying Apple Intelligence is malicious as it is, but it opens the door to endless new possibilities for surveillance.

This is the problem of course.

It's not like there isn't history: https://www.nytimes.com/2022/08/21/technology/google-surveillance-toddler-photo.html
 
I should have stated it more clearly: no account or cloud dependency for things I can't afford to lose. No Apple ID. No iCloud.
I do that partially; I keep all my pictures and music on local storage and then back up to Time Machine as well as a cloud service called Backblaze, which is my line of defence if my house burns down or my computer hard drive and my backup hard drive all break at the same time.
 
There is a clause about allowing Apple to look at individual images in your library, and MD5 hashes are viewable by Apple as well, even with Advanced Data Protection. Nothing about automation, but there's no reason they would ever need to mention it, so all in all it sounds like CSAM scanning is probably in use.

The type of scanning that involves identifying images with AI probably isn't possible with ADP, though.

/edit: ADP wouldn't help if the scanning happens locally on-device. No indication that this is happening, though.
This needs to be qualified: it only refers to "Content" that is transmitted to or stored in iCloud, not anything stored locally.

Apple's privacy policy is clear that Apple does not have the ability to decrypt ADP data. A notable exception is iCloud Mail, which is not end-to-end encrypted because Apple cannot process and deliver mail without access to it. Another notable exception is shared content: E2EE is only available if all participants have ADP enabled.

I couldn't find the clause you refer to about allowing Apple to see images in the library or hashes.

Keep in mind most people share data through different services that are not subject to Apple's jurisdiction, so even if Apple is unable to access anything, someone else may be able to. So even if someone divorces themselves entirely from Apple's ecosystem, it's hard to imagine using any Internet-connected device without some risk. Frankly, Apple is the least of my worries.
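
On the shared-content point: the usual reason every participant has to be E2EE-capable is envelope encryption. The item is encrypted once with a content key, and that key is then wrapped to each participant's public key; if one participant has no usable key pair, the content key has to be made available some other way and the end-to-end guarantee is gone for everyone. A rough CryptoKit sketch of the idea (my own illustration, not Apple's actual implementation):

Code:
import CryptoKit
import Foundation

// Each participant holds a Curve25519 key pair.
let alice = Curve25519.KeyAgreement.PrivateKey()
let bob = Curve25519.KeyAgreement.PrivateKey()

// The shared item is encrypted once with a random content key.
let contentKey = SymmetricKey(size: .bits256)
let photo = Data("shared photo bytes".utf8)
let sealedPhoto = try! AES.GCM.seal(photo, using: contentKey)

// Wrap the content key for one recipient via ECDH + HKDF.
func wrap(_ key: SymmetricKey,
          from sender: Curve25519.KeyAgreement.PrivateKey,
          to recipient: Curve25519.KeyAgreement.PublicKey) throws -> Data {
    let secret = try sender.sharedSecretFromKeyAgreement(with: recipient)
    let kek = secret.hkdfDerivedSymmetricKey(using: SHA256.self,
                                             salt: Data(), sharedInfo: Data(),
                                             outputByteCount: 32)
    let raw = key.withUnsafeBytes { Data($0) }
    return try AES.GCM.seal(raw, using: kek).combined!
}

// One wrapped copy of the content key per participant.
let keyForBob = try! wrap(contentKey, from: alice, to: bob.publicKey)
print("wrapped content key for Bob:", keyForBob.count, "bytes")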
 
I looked at my iCloud settings and it locked me out unless I agreed as well.

I have never used iCloud Photos, iCloud Notes, or iCloud Backup anyway, so there isn't much to scan, especially with ADP on.

My solution: 2-3x a year when I visit my parents, I update a 4TB drive I keep there. It's my off-site backup, and it goes in their fireproof safe when I am not there.
 
" 'Apple' means Apple Services Pte. Ltd., located in Singapore."

And maybe, given the Apple Ireland EU issue, within a few months every other country will also be routing that Services revenue to Singapore rather than to Apple Distribution International Ltd, which is based in, you guessed it, Ireland...
 
I looked at my iCloud settings and it locked me out unless I agreed as well.

I have never used iCloud Photos, iCloud Notes, or iCloud Backup anyway, so there isn't much to scan, especially with ADP on.

My solution: 2-3x a year when I visit my parents, I update a 4TB drive I keep there. It's my off-site backup, and it goes in their fireproof safe when I am not there.
"especially with ADP on", as long as the scanning doesn't occur on device
 
I looked at my iCloud settings and it locked me out unless I agreed as well.

I have never used iCloud Photos, iCloud Notes, or iCloud Backup anyway, so there isn't much to scan, especially with ADP on.

My solution: 2-3x a year when I visit my parents, I update a 4TB drive I keep there. It's my off-site backup, and it goes in their fireproof safe when I am not there.
I like your solution.

I knew years ago that this CSAM thing would rear its ugly head all over again.

We STILL have people being held in prison without having had a trial. We can't trust CSAM scanning. We can't trust Apple, Microsoft, Adobe, Nvidia, AMD, ARM, or any of them when they put one foot into the government's bed.

They can already put anybody in prison for any reason they desire. Or no reason at all; the fact is, they don't have to prove anything. All they have to do is accuse you and plant evidence. Away you go to prison. Congratulations, CSAM scanning just made it that much easier.
 