The point is that it's automatic; it's been enabled silently for everyone who has iCloud enabled (which is most people; Apple prompts you to enable it when you set up a device).

I have no problem with Apple checking that you're not storing bad **** on their servers, but they should be getting your informed consent first. That means telling you exactly what they'll be looking for and who they'll be notifying if they find it (e.g., "we search for known pro-democracy images and notify the CCP when we find some").
Wrong again. You have to enable iCloud first, then OPT IN to this.
 
Hardly. It's not about how it works. It's about Apple violating privacy. They have no right at all to view or scan private images. No right at all.
It's a warrantless search they are doing. A violation of constitutional rights, even.

Apple can put that as a requirement for you to use their services (property). Since you have given permission it's not a warrantless search.

Also, wouldn't the illegality of warrantless searches only apply to the government, and not to private entities?
 
I think everyone just needs to read about it from the horse's mouth:


Right, I've read through that.

So once again, why not do the scanning / safety voucher assignment in the cloud? As it stands, if someone isn't using iCloud Photos, they could still have images on their device that are "flagged" as such. They simply haven't left the device yet.

Unless, of course, this entire process is only "turned on" when a user has iCloud Photos enabled. But Apple isn't clear on that.
 
I don't know if you are just aloof,

but

this applies to iCloud data, which Apple already shares with the feds or local cops if they ask for it.
What Apple does NOT give them is end-to-end encrypted data, which is not on the cloud.

iCloud Photos are NOT end-to-end encrypted

this is exactly why celebrity photos from iPhones have been hacked

this is exactly why Apple can share iCloud data with cops

this is exactly why Apple cannot scan your photos if you have iCloud photos off
… the scanning is done on device, not on Apple's servers. Reread the article because you're clearly confused. I don't mean that as a cut at you, rather at Apple, as this announcement is one huge quagmire of terrifying confusion.
 
This is very unexpected and worryingly dystopian. Who decides what's illegal? The government, not Apple. Apple happens to start with child pornography because it's something everyone agrees is bad, and the moment you say anything against this system, you're labeled a pedophile so it's very hard to argue against it.

But what will happen in other countries with less agreeable laws? In Russia, gay pornography may soon be illegal. In China, all forms of pornography are illegal. Hell, even showing pictures of Winnie the Pooh is illegal, and anything deemed "anti-government" may result in a visit from their fleet of black mobile execution buses that murder you on the spot and then distribute your organs, while they're still fresh, to those who haven't said anything against the government yet.

It may start with child porn but this is literally a tiny step away from Big Brother, with governments being able to instantly see who is a potential "enemy of the state", flag them up and have them oppressed, deported or executed. Russia, China and Belarus are going to have a blast mass-murdering their anti-government citizens with the help of these algorithms.

Add to this the Pegasus scandal and now you have exactly what everyone has been paranoid about: inescapable high-tech government surveillance with the power of AI.


Also, wait a sec, if it only scans for specific child porn images, how the hell does it know if you're being sent "a nude image"? Then surely it scans against a more general match too, not just a given database of specific images. Imagine your phone alerting your parents if your boyfriend/girlfriend sends you a nude. Who the hell would want that? What kind of world is this? Welcome to being surveilled by AI, your parents, big corporations and the government, all at once. Sounds great, doesn't it?

Oh, and with all the talk of "all your data on your iPhone being encrypted", is that a big fat lie then? Or at the very least misleading? People probably think their data is unreadable to anyone without the passcode, and yet that definitely isn't true.
 
… the scanning is done on device, not on Apple's servers. Reread the article because you're clearly confused. I don't mean that as a cut at you, rather at Apple, as this announcement is one huge quagmire of terrifying confusion.

It’s done on device but on the way to Apple’s servers.
Because it’s less scary than scanning 100% of the pics once they are already on the servers.
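In code terms, the claim is that the check lives inside the upload path, not in a background scan of the whole library. A minimal sketch of that shape, with every name hypothetical and the voucher contents simplified:

```swift
import Foundation

// Hypothetical sketch: the voucher is produced only as a step of the
// iCloud upload itself; photos that never upload are never checked.
struct SafetyVoucher {
    let photoID: UUID
    let encryptedMatchResult: Data  // opaque to the device and, below the threshold, to the server
}

func uploadToICloudPhotos(_ photo: Data, id: UUID, makeVoucher: (Data) -> SafetyVoucher) {
    let voucher = makeVoucher(photo)      // computed on device, on the way out
    send(photo: photo, voucher: voucher)  // both travel to the server together
}

func send(photo: Data, voucher: SafetyVoucher) {
    // Placeholder for the actual network call.
}
```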
 
THIS SAYS IT ALL: apples-plan-think-different-about-encryption-opens-backdoor-your-private-life.

I've been using Apple products since I bought the original Mac in 1984. Throughout the years Apple was about privacy. Betas of iOS 15 and macOS Monterey PREACH about all the new levels of PRIVACY being built into SAFARI, MAIL, etc. Now THIS BS. I'm done with Apple. I've recently updated my hardware to an M1 iMac, MBA and iPad Pro 11". I wouldn't tolerate the gov't sniffing around like this, let alone a commercial entity.

Tim Cook is a hypocrite. How can you advocate end-to-end encryption while essentially saying you are going to be snooping on all texts and photos? What's next, all email? Where does it end? How many of our constitutional protections is Apple going to violate?

I wouldn't be surprised if TC is using this as posturing to curry favor with Congress, etc., as it attempts to place regulations on the way big tech operates and handles social media.

My 35 years of being an Apple fanboy are over! Where's Edward Snowden?
 
Right, I've read through that.

So once again, why not do the scanning / safety voucher assignment in the cloud? As it stands, if someone isn't using iCloud Photos, they could still have images on their device that are "flagged" as such. They simply haven't left the device yet.

Unless, of course, this entire process is only "turned on" when a user has iCloud Photos enabled. But Apple isn't clear on that.

Read the last paragraph:
This innovative new technology allows Apple to provide valuable and actionable information to NCMEC and law enforcement regarding the proliferation of known CSAM. And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM.

In other words, since the scanning is done on your phone, Apple never learns any info about your photos unless 1) they're flagged as matching known CSAM AND 2) you upload enough of those flagged photos to iCloud to exceed the threshold.

If they were scanning directly on iCloud, those are their own servers, and thus they'd have access to that scanning info, so it's not as private.
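To make the threshold mechanic concrete, here's a rough Swift sketch. Everything in it is illustrative: the names and the threshold value are my own placeholders, SHA-256 stands in for the perceptual NeuralHash Apple describes, and the real design uses threshold secret sharing rather than a plain counter, so the server mathematically learns nothing below the threshold.

```swift
import Foundation
import CryptoKit

// Placeholder database and threshold; illustrative values only.
let knownCSAMHashes: Set<String> = []
let reportingThreshold = 30

// SHA-256 as a stand-in for the perceptual NeuralHash.
func hexDigest(_ data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Each photo is checked on device as part of the upload.
func matchedCount(of uploads: [Data]) -> Int {
    uploads.filter { knownCSAMHashes.contains(hexDigest($0)) }.count
}

// Nothing about any individual photo is revealed unless the whole
// collection of matches crosses the threshold.
func accountExceedsThreshold(uploads: [Data]) -> Bool {
    matchedCount(of: uploads) >= reportingThreshold
}
```

The upshot: a single stray match reveals nothing; only a collection of matches past the threshold does.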
 
People fail to understand that these initial types of changes are never universally condemned. Just like when the Jews had to wear yellow stars.
 
I have two kids, and like many, many other parents, when they were babies and toddlers I used to take pics of them playing in the house, outside or in the bath, in diapers or underwear, or even nude. Obviously the intention is to capture memories of my kids doing hilarious things, and to save them privately as future memories. But how would Apple know this?

Because they are only scanning for known pictures of child pornography being actively shared. They aren't scanning for child pornography which isn't shared.

Your nude pictures, which aren't pornography and aren't shared, won't have a hash in the database.
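A toy illustration of that point (all names hypothetical, SHA-256 standing in for the perceptual hash actually used): matching is a set lookup against hashes of already-catalogued images, not a judgment about what a photo depicts.

```swift
import Foundation
import CryptoKit

// SHA-256 as a stand-in for the perceptual NeuralHash.
func hexDigest(_ data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Hypothetical database: only already-catalogued images have entries.
let knownHashes: Set<String> = ["placeholder-entry-1", "placeholder-entry-2"]

// A private, never-shared photo was never catalogued, so there is
// simply no entry for it to collide with.
let familyPhoto = Data("bath-time photo bytes".utf8)
print(knownHashes.contains(hexDigest(familyPhoto)))   // prints "false"
```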
 
Apple can put that as a requirement for you to use their services (property). Since you have given permission it's not a warrantless search.
I have of course not given permission and never will.

I'm curious what your apologia will be once they expand this functionality to all images (and videos, etc.), not just those stored in iCloud. And they of course will do that; it's just a matter of time.
 
Read the last paragraph:


In other words, since the scanning is done on your phone, Apple never learns any info about your photos unless 1) they're flagged as matching known CSAM AND 2) you upload enough of those flagged photos to iCloud to exceed the threshold.

If they were scanning directly on iCloud, those are their own servers, and thus they'd have access to that scanning info, so it's not as private.

Again, I get that. But if the argument is about taking action on images that are housed on their servers, then why scan on device at all? You lose encryption once you upload to iCloud anyway. Why go to all of the trouble to implement this scanning process on device?

This is not a great analogy, but it's like my HOA coming into my house blindfolded and scanning the barcodes of everything in my house. If an item is flagged as stolen, they won't do anything about it until it leaves my house. But as soon as it leaves my house and enters the neighborhood, they let the local authorities know it's there. Should I get in trouble for having a stolen item in my house? Absolutely. But the police should be initiating that process, not my HOA. And the police would be able to get a warrant and actually take action IN my house...

It's simply an overstep by Apple to do it like this.
 
Well, yes. If they violate my personal integrity it would be called theft. Or burglary.

I don't think it is illegal to violate your personal integrity (how do you do that?) unless you mean the integrity of your body.

If you give Apple permission to search your stuff there are no legal issues. Apple will require you to give such permission to be allowed to use iCloud. So no laws are broken in the US.
 
The technology used here will be repurposed at some point. That Apple, with billions in some offshore tax haven, is caving guarantees that a) it will get worse and b) the other guys are doing it or will be doing it.

Thing is, the other guys don't lie to me about caring, and their products are significantly cheaper. I pay extra for privacy and security that were just a lie.
 
This is very unexpected and worryingly dystopian. Who decides what's illegal? The government, not Apple. Apple happens to start with child pornography because it's something everyone agrees is bad, and the moment you say anything against this system, you're labeled a pedophile so it's very hard to argue against it.

But what will happen in other countries with less agreeable laws? In Russia, gay pornography may soon be illegal. In China, all forms of pornography are illegal. In Japan, showing genitals is illegal. In China, showing pictures of Winnie the Pooh is illegal, and anything deemed "anti-government" may result in a visit from their fleet of mobile execution buses that murder you and then distribute your organs to those who haven't said anything against the government yet.

It may start with child porn but this is literally a tiny step away from Big Brother, with governments being able to instantly see who is a potential "enemy of the state", flag them up and have them oppressed, deported or executed.
Ever taken a nude photo of yourself in the mirror at the age of 17? Congratulations. You're now officially in possession of child porn. Ever viewed porn and not realized that the person in the photo/video was underage, despite not looking underage, and not thought much about it? Well, now that's a 20-year prison sentence for you. Ever heard of devices and accounts getting hacked? Well, now someone can strategically upload child porn to your iCloud account to get you automatically incriminated and make you disappear for a few years. Great for oppressive governments that want to get rid of their anti-government activists. Ever wanted to win an election against a stronger opponent? Time to put some child porn on their iCloud account. Now they're automatically a sex offender, and whatever they say sounds terrible for them.

Just think of the possibilities, beyond the marketing.
 
It doesn't seem like anybody read any details about what Apple is actually doing here (not a surprise). This appears to include Snowden and the EFF (also not much of a surprise).

Apple is not scanning all your pictures looking for certain types of content that "might" be images of child sexual exploitation or abuse.

Rather, there is a database of known child sexual exploitation images maintained by law enforcement. Cryptographic hashes are generated for each of these known images/files.

The only things Apple is scanning for are files that match the cryptographic hashes of known images of child sexual abuse. They are not looking at your images using machine learning, or anything close to that.

If this is something you feel is worthy of criticism, go to town. But criticize what they are actually doing, not some inflated imaginary version of what they're doing.

Oh, and as per yesterday's article on the subject -- Apple and all other major tech companies have already been doing this for years. It's nothing new.
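For anyone curious what "hashes are generated for each of these known images" means mechanically, here's a minimal sketch, assuming plain SHA-256 over the file bytes (the deployed system uses a perceptual hash so that re-encoded copies still match, but the distribute-hashes-not-images idea is the same):

```swift
import Foundation
import CryptoKit

// Reduce each known image file to a fixed-size hash string. Only these
// hashes are distributed for matching; the images themselves never leave
// the maintainers' custody.
func buildHashDatabase(from imageFiles: [URL]) -> Set<String> {
    var database: Set<String> = []
    for file in imageFiles {
        guard let bytes = try? Data(contentsOf: file) else { continue }
        let digest = SHA256.hash(data: bytes)
        database.insert(digest.map { String(format: "%02x", $0) }.joined())
    }
    return database
}

// Checking a local file is then a set lookup, not a content analysis.
func matches(_ file: URL, against database: Set<String>) -> Bool {
    guard let bytes = try? Data(contentsOf: file) else { return false }
    let digest = SHA256.hash(data: bytes)
    return database.contains(digest.map { String(format: "%02x", $0) }.joined())
}
```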
 
#1 - A middle school girl, we'll say 13 years old, sends a photo of herself topless to her 14 year old boyfriend.

#2 - A 16 year old boy takes a video of him and his 15 year old girlfriend...

#3 - An adult is looking at pornographic pictures and saves one that ends up being of an underage girl - she's 17 but he thought she looked 21. He's now in possession of what could be considered child pornography, and for all we know it could be an image that matches up with something in the NCMEC database. Does his account deserve to get shut down and reported to NCMEC?

Personally I'm against porn in general, of all types and ages, because I think it's unhealthy, and also one of the primary funding sources for sex trafficking. But even I, who would be considered by most to be a prude in this area, still see overreach here and think this is an extremely slippery and dangerous precedent.

#1 won't be flagged since it's not a known child pornography picture in the database. Also, she is probably too old.

#2 won't be flagged since it's not a known child pornography picture in the database. Definitely too old.

#3 won't be flagged because the girl is too old to be in the database.

The database contains mostly prepubescent children in explicit sexual poses or acts. Teenage sex and nudity photos are not the target for this system.
 
This is very unexpected and worryingly dystopian. Who decides what's illegal? The government, not Apple. Apple happens to start with child pornography because it's something everyone agrees is bad, and the moment you say anything against this system, you're labeled a pedophile so it's very hard to argue against it.

Banks, trading platforms and KYC/AML-compliant cryptocurrency exchanges have to report "weird stuff" to the government, the IRS, etc. That doesn't mean they decide what's illegal. It doesn't mean they need a "warrant"; it's probably written in the terms and conditions somewhere.
Same for cloud hosts and CP, I suppose.
 
This is very unexpected and worryingly dystopian. Who decides what's illegal? The government, not Apple. Apple happens to start with child pornography because it's something everyone agrees is bad, and the moment you say anything against this system, you're labeled a pedophile so it's very hard to argue against it.

But what will happen in other countries with less agreeable laws? In Russia, gay pornography may soon be illegal. In China, all forms of pornography are illegal. In Japan, showing genitals is illegal. In China, showing pictures of Winnie the Pooh is illegal, and anything deemed "anti-government" may result in a visit from their fleet of mobile execution buses that murder you and then distribute your organs to those who haven't said anything against the government yet.

It may start with child porn but this is literally a tiny step away from Big Brother, with governments being able to instantly see who is a potential "enemy of the state", flag them up and have them oppressed, deported or executed.

Ever taken a nude photo of yourself in the mirror at the age of 17? Congratulations. You're now officially in possession of child porn. Ever viewed porn and not realized that the person in the photo/video was underage, despite not looking underage, and not thought much about it? Well, now that's a 20-year prison sentence for you. Ever heard of devices and accounts getting hacked? Well, now someone can strategically upload child porn to your iCloud account to get you automatically incriminated and make you disappear for a few years. Great for oppressive governments that want to get rid of their anti-government activists. Ever wanted to win an election against a stronger opponent? Time to put some child porn on their iCloud account. Now they're automatically a sex offender, and whatever they say sounds terrible for them.

Just think of the possibilities, beyond the marketing.
Do you seriously think Apple hasn’t considered the possibility of people having these kinds of images uploaded against their will? If you’ve thought of it, so has Apple. They’re not dumb.
 
One also has to ask: why now? Apple had to realize there would be backlash to this, since they label themselves as the privacy champions. If there was a need to introduce a tool like this on everyone's devices because of external pressure, what better way to introduce it than under the banner of the "fight against child pornography"?
 
That's not what he's saying. He's saying that if Apple puts this in there, then what's stopping governments from going "well, you put that in there, perhaps you can put THIS in there so we can check for things. Oh, and can you give us access just in case there are other crimes there, I mean, you DID put in this encryption backdoor, so we know you can do that... so you can do this other thing too".

It isn't an encryption backdoor. Most of what you store in iCloud is readable to Apple. They even document it.
 
It isn't an encryption backdoor. Most of what you store in iCloud is readable to Apple. They even document it.

A locally assigned “CP or not” label is not an encryption backdoor.
It’s something running locally when the device is up and running.
By that logic, even looking at photos is an encryption backdoor, because you're looking at them, so the iPhone must have decrypted the data at some point.
Not to mention giving camera roll access to third-party internet-connected apps… you're saying you trust Apple less than those apps?
 
Again, I get that. But if the argument is taking action on images that are housed on their servers, then why scan on device at all? You lose encryption once you upload to iCloud anyway. Why go to all of the trouble to implement this scanning process on device?

So you'd rather they be analyzing all your photos on iCloud, where they can decrypt and view them, vs. on your phone, where they can't? I'm honestly not understanding your logic here. This new method is far more privacy-friendly.
 