Because they are supposed to respect our privacy more than the others… even if, in theory, it poses no threat to our privacy. It’s about respecting their customers. We didn’t sign up to have a constant forensic analysis of our yoga pants pics and vaccine memes performed and shared with “don’t worry, we’re just kiddy porn cops” goons from the government.

I’d be very interested in seeing how slightly changing the crime rationale affects people’s response. For example, change kiddy porn to vaxx-denier memes and see whether this is still OK. Or election doubters?

Is it ridiculous to wonder if even discussing this gets us placed on a list, or scored on an existing one?
See: China and their “Social Credit Score” system.

Nope. You can keep that the hell away from my family and me.

Frankly, it’s extremely frightening to see how many American citizens are so eagerly willing to give up their freedoms. They won’t take it all at once, people. They’ll slowly chisel away at it until you don’t have any freedoms or rights remaining.

Be careful what you wish for.
 
The only potential bright spot here is.... If Apple is moving this function to the local device, is it possible they are FINALLY going to introduce end-to-end encryption for iCloud? Why else would they be moving this scanning to the local device?

But on the subject itself, this scanning seems pretty ineffective and easily defeated when it comes to catching the intended miscreants -- while simultaneously opening a very large can-o-worms, including expansion by a fascist (or fascist-wannabe) government that directs Apple to scan for political images, protest images, gun images, copyrighted works, and who knows what else. That can even happen in a so-called "free" country where the DOJ and judges are politicized.

While the stated purpose is good, I question the effectiveness and am really concerned about future abuse.
 
Ah! Finally! Someone here is paying attention!
 
Welp that means that all the upset sex traffickers can rest easy and hide. SMH.
Any sex trafficker/child porn scumbag who was/is dumb enough to upload their images to Apple‘s ****ing servers probably isn’t smart enough to figure out this workaround, or even know about it.

Also, speaking for myself — someone who is not a sex trafficker and who THANK GOD is also not a child molester — I AM a ****ing storage hoarder, and if I turned off iCloud Photos right now, I’d be ****ED. Royally. I’m a filmmaker and amateur photographer and use my iPhone all the time to take photos and to shoot small video stuff. Basically, all the stuff that doesn’t require me to break out my mirrorless and its lenses. I’ve also uploaded a lot of photos and some videos from my mirrorless to iCloud.

Five minutes later: I felt compelled to look, and with the 2TB family plan I have — which I let my mom and cousin use — I‘m using 1.2TB of storage, compared to their combined 16.7GB, lol.*

From my 1.2TB, 815GB of that is from photos/videos. Jesus. The sad thing is, I know that I’m never going to go through them and clean things up; I’ll just hop on the 3TB train when it pulls into the station, then the 4TB…

I guess what I’m saying is, if I grabbed my iPhone 12 Pro with its 128GB of storage and turned off iCloud Photos right now — I don’t think I could? At least not without transferring like 800GB of photos to a hard drive, or somehow moving them to a different cloud storage like Dropbox or Google Drive, or having to spend days sifting through 800GB of photos to decide which ones I NEED to keep.

To me, that would be its own circle of hell. Just that.

————————————

*I will point out, though, that my iCloud email address ends with @me.com, not @icloud.com, so all the stuff I have uploaded is from over the course of like, twelve years or so now. I’m still bitter that I missed out on the @mac.com email address by like two months.

Anyone else on here a former MobileMe Vet, lol? How about any @mac.com holdouts?
 
Just re-watched season 1 of True Detective. Hard to get too mad at this.
 
I can still remember watching a news story about how one-hour photo developers would routinely look at people's photos. When they found "child porn," they would report it to the police. The story was about how often they got it wrong and what the consequences were.

But hey: AI, some really good math algorithms… what could possibly go wrong?
 
Any sex trafficker/child porn scumbag who was/is dumb enough to upload their images to Apple‘s ****ing servers probably isn’t smart enough to figure out this workaround, or even know about it.
Just like a thief stealing a catalytic converter, they don't care about the consequences, so I'm not sure what the point of telling me this is.
Also, speaking for myself — someone who is not a sex trafficker and who THANK GOD is also not a child molester — I AM a ****ing storage hoarder, and if I turned off iCloud Photos right now, I’d be ****ED. Royally. I’m a filmmaker and amateur photographer and use my iPhone all the time to take photos and to shoot small video stuff. Basically, all the stuff that doesn’t require me to break out my mirrorless and its lenses. I’ve also uploaded a lot of photos and some videos from my mirrorless to iCloud.

Five minutes later: I felt compelled to look, and with the 2TB family plan I have — which I let my mom and cousin use — I‘m using 1.2TB of storage, compared to their combined 16.7GB, lol.*

From my 1.2TB, 815GB of that is from photos/videos. Jesus. The sad thing is, I know that I’m never going to go through them and clean things up; I’ll just hop on the 3TB train when it pulls into the station, then the 4TB…
Hmm, okay. Not sure why you felt the necessity to tell me this, especially with so much circumvented profanity. 😉
I guess what I’m saying is,
Yeah cuz I don't know what you're saying at all. 😂
if I grabbed my iPhone 12 Pro with its 128GB of storage and turned off iCloud Photos right now — I don’t think I could? At least not without transferring like 800GB of photos to a hard drive, or somehow moving them to a different cloud storage like Dropbox or Google Drive, or having to spend days sifting through 800GB of photos to decide which ones I NEED to keep.

To me, that would be its own circle of hell. Just that.
Well alright. Thanks. 😊
 
That’s the most obnoxious argument ever when it comes to privacy rights.
Agreed. 120%, so ****ing agreed. I despise this statement so much:


“If you're not doing anything wrong, then you have nothing to worry about.”


It comes back EVERY ****ING TIME something like this comes up and quietly challenges our right to privacy, not forcefully and bombastically, but with a small nudge. Those nudges add up, folks.

It’s infuriating to see so many people not give a ****.

You have to wonder whether they ever considered that what counts as “wrong” is fluid, that what is acceptable/legal today may not be acceptable/legal ten years from now. But sure, open the floodgates and give up your right to privacy now, albeit behind honorable intentions. My friends: that’s how they get ya.


You want first-hand experience? Here’s mine. I spent a few years, sadly, working for the government, spying/intruding on other people’s (NOT Americans’) privacy. I’m not going to go into more detail than that, for obvious reasons. But I’ve seen the other side of it, and I will say — emphatically — that it disgusted me, and fifteen years later, I still feel shame about it.

Treasure your privacy. Please.

Hey, howzabout some quotes! ENGAGE!

Eddie Snowden says, "Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say."

I like this one, a lot: “If one would give me six lines written by the hand of the most honest man, I would find something in them to have him hanged.” (Supposedly from Cardinal Richelieu).

———————————————


MacRumors has been a BITCH to leave comments on over the past few months when using the website on my iPad Pro. So I apologize ahead of time for any weird typos or grammatical errors. I promise, it ain’t me, babe. It ain’t me.
 
Agreed. I applaud the intent, but what happens when there’s an inevitable bug that causes somebody who did nothing wrong to have the cops show up at their door with a warrant? This is a little too Big Brother for me. I’ll shut off iCloud Photos. I only recently turned it on anyway.
Actually, the intent is creepy. It is not like an elected official passed a law or there was a recent referendum. Apple seems to have just decided that this was "the right thing to do." What if they decided that monitoring hate, extremism, suicidal ideation, bullying, or whatever else we all currently oppose, however Apple chooses to define those things, was also the right thing to do?
 
What’s even the point? I feel like if you’re one of these people you were going to turn off iCloud right away to begin with.

These people are caught out pretty regularly, so I don’t think they’re that bright.
 
If countries have their own databases, they could easily add other content to the search database.

Perhaps meme images that insult the monarch (see Thailand's extremely harsh rules on making fun of the royal family), or a "free Tibet" flag in China.

If you have to go on a business trip to China, Thailand, etc., you may suddenly end up in a kangaroo court facing a long prison term for images that are legal in the USA.
The US will not be any better. It’s a slippery slope; the only question is how far you slip down it.
 
Apple is shooting itself in the foot by announcing this; people will use its services less, whether out of ignorance or out of fear of false positives.

So help me, if one false positive happens and gets media attention, users will go bananas.

I get the impression that Apple didn’t have an option to opt out when Big Brother showed up.
 
You should read how file hashing works and how it's used to identify CSAM. No one "peruses" your pictures. Your pictures produce a unique hash value using a hash function such as MD5 or SHA, which doesn't reveal anything about the content of your photo - that hash is then compared to a very large database of known CSAM hash values. If there's a match, it kicks out a report to the appropriate LE agency (in the US, it usually goes to a local ICAC).

There has only ever been one SHA-1 hash collision in history, when researchers at Google and CWI Amsterdam spent a ton of CPU/GPU time deliberately trying to produce one, and they succeeded only once in roughly nine quintillion attempts. So it's pretty accurate!
This is NOT what Apple is doing.
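For anyone curious what the quoted post is describing, here's a minimal sketch of exact cryptographic hash matching in Swift, with hypothetical digest values. As the reply above points out, this is not how Apple's system works: Apple uses a perceptual hash (NeuralHash) so that resized or re-encoded copies of a known image can still match, whereas the exact matching below breaks if a single byte changes.

```swift
import Foundation
import CryptoKit

// Sketch of exact-hash matching (NOT Apple's system): compute a cryptographic
// digest of each file and look it up in a set of known digests.
// The digest values and file paths here are hypothetical placeholders.

/// Hex-encoded SHA-256 digests of known files (placeholder values only).
let knownDigests: Set<String> = [
    "placeholder-digest-1",
    "placeholder-digest-2",
]

/// Returns the SHA-256 digest of a file as a lowercase hex string.
func sha256Hex(of fileURL: URL) throws -> String {
    let data = try Data(contentsOf: fileURL)
    return SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// True only if the file's bytes are identical to a file already in the set;
/// re-saving, resizing, or recompressing the image changes the digest.
func matchesKnownDigest(_ fileURL: URL) -> Bool {
    guard let hex = try? sha256Hex(of: fileURL) else { return false }
    return knownDigests.contains(hex)
}
```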
 
What’s next? Scanning your stuff on iCloud for anti-government materials on behalf of oppressive governments?
Apple is just one piece of legislation away from scanning images for other “illegal” materials, and the definition of what is illegal varies greatly across different jurisdictions. Of course, governments will always use “porn” or “children” as an excuse.

And most importantly, now that even the USA is doing this, other governments can just follow suit and scan for illegal images threatening national security, “to protect their citizens” from being harmed by those images too. And yeah, it’s not an invasion of privacy, not a limitation of freedom; freedom has its limits, right? If you have nothing to hide, what’s there to be afraid of? Oppressive governments always say something like this.
 


Apple today announced that iOS 15 and iPadOS 15 will see the introduction of a new method for detecting child sexual abuse material (CSAM) on iPhones and iPads in the United States.

[Image: iCloud-General-Feature.jpg]

User devices will download an unreadable database of known CSAM image hashes and will do an on-device comparison to the user's own photos, flagging them for known CSAM material before they're uploaded to iCloud Photos. Apple says that this is a highly accurate method for detecting CSAM and protecting children.

CSAM image scanning is not an optional feature and it happens automatically, but Apple has confirmed to MacRumors that it cannot detect known CSAM images if the iCloud Photos feature is turned off.

Apple's method works by identifying a known CSAM photo on device and then flagging it when it's uploaded to iCloud Photos with an attached voucher. After a certain number of vouchers (aka flagged photos) have been uploaded to iCloud Photos, Apple can interpret the vouchers and does a manual review. If CSAM content is found, the user account is disabled and the National Center for Missing and Exploited Children is notified.

Because Apple is scanning iCloud Photos for the CSAM flags, it makes sense that the feature does not work with iCloud Photos disabled. Apple has also confirmed that it cannot detect known CSAM images in iCloud Backups if iCloud Photos is disabled on a user's device.
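
To make the voucher-and-threshold idea described above concrete, here is a heavily simplified sketch in Swift, a minimal illustration with made-up names and a placeholder threshold. Apple's actual design uses cryptographic safety vouchers and threshold secret sharing so that vouchers below the threshold cannot be read at all; this sketch only illustrates the counting logic, not the cryptography.

```swift
// Simplified illustration of the "flag, collect vouchers, review only past a
// threshold" flow described above. Names, types, and the threshold value are
// hypothetical; the real system enforces the threshold cryptographically.

struct SafetyVoucher {
    let photoID: String        // identifier of the uploaded photo
    let matchedHash: String    // the known-CSAM hash it matched
}

final class AccountReviewGate {
    private let threshold: Int
    private var vouchers: [SafetyVoucher] = []

    init(threshold: Int = 30) {   // placeholder value, not Apple's actual number
        self.threshold = threshold
    }

    /// Record a voucher for a flagged upload. Returns the accumulated vouchers
    /// only once the threshold is crossed; below it, nothing is surfaced for
    /// manual review.
    func record(_ voucher: SafetyVoucher) -> [SafetyVoucher]? {
        vouchers.append(voucher)
        return vouchers.count >= threshold ? vouchers : nil
    }
}
```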

It's worth noting that Apple is scanning specifically for hashes of known child sexual abuse materials and it is not broadly inspecting a user's photo library or scanning personal images that are not already circulating among those who abuse children. Still, users who have privacy concerns about Apple's efforts to scan user photo libraries can disable iCloud Photos.

Security researchers have expressed concerns over Apple's CSAM initiative and worry that it could in the future be expanded to detect other kinds of content with political and safety implications, but for now, Apple's efforts are limited to seeking child abusers.

Article Link: Apple Confirms Detection of Child Sexual Abuse Material is Disabled When iCloud Photos is Turned Off



Confirms what I thought would come years ago. Hence, I’ve never used iCloud for photos… I knew it would come to this, and it’s also why I do not sync my Documents folder with iCloud. Not kidding… I was wary of something like this (but not with KP as the initiator) years ago.
 