Rather than just reacting based on no facts, get informed.
Funny. I did check the facts, thank you.

Again - it does not matter at all, not one tiny little bit HOW they are doing it. They developed a tool, designed to spy on their customers. Irrelevant how sophisticated it is.

And please don't get me started on "only when iCloud is turned on". That is true for now, maybe. It's unlikely to stay that way.
 
No, this does not solve the problem and now you’re just being flippant.

I am happy to consent to Apple scanning my photos in iCloud. Apple could enable server-side scanning, and I would not stop using iCloud because of it.

With the current implementation, turning off iCloud photos does not prevent the scanning software from being installed on my device. They just *promise* that it won’t be used, and that’s not good enough for me, or for the other people voicing concern here.

So how do you feel about the feature which can delete every app you have on your phone being installed as part of the OS?
 
Well, I'm up for a new phone this year and also was waiting on the M2 MacBooks. The keyboard on my 2018 MBP has turned into pure unreliable garbage compared to the 2012 it replaced, so I'm already pretty irritated. Just paid $1500 for an iPad Pro/keyboard in September. Guess it's time to review the path forward now that 1984 has finally made its way to Apple.
I'm up for a new iPhone too, my battery is starting to go. Before this week I wouldn't have thought twice about it, I would have just done it. Now I probably will still buy one, but it might not be my main phone anymore, and certainly not as high-priced a one. I'll be purchasing a new Samsung Flip before that, and that will probably become my main phone. I love the form factor WAY better than brick style, and since the iPhone is not private anymore, there's nothing that really sets it apart. Though I do wonder how my Apple Watch will do without it always near.
 
Rather than just reacting based on no facts, get informed.
Since you've done the work and informed yourself, could you please describe how Apple's hash-matching algorithm works? They call it "neural match". I have of course read their technical summary of the technology, but it's so vague that it's raising alarms. They're obviously computing the hash from image features, not pixel values. So while you're explaining in detail how they do it, could you also elaborate on the risk posed by adversarial attacks? Maybe even in comparison to hash functions based on pixel values? (I have read the important papers on this, as it's part of my research, so feel free to go deep.)

What exactly are the image descriptors they get from their embedding networks? What's the distance threshold at which descriptors count as perceptually and semantically "close" vs. "far apart"?

How are results transferred back to Apple? Can the traffic be blocked by a local user firewall or are they using their technology to bypass the normal network stack, so this can't be blocked locally and an external firewall is needed to block traffic to "Apple HQ"?

I have more questions, but let's start easy here, then see where we go.
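For anyone following along: Apple's actual NeuralHash is a learned neural embedding plus a locality-sensitive hash, and its internals aren't public, so nobody here can answer these questions precisely. But the contrast between feature-based and pixel-based hashing is easy to sketch. This toy difference hash ("dHash") and a cryptographic hash over raw pixels show why a feature/structure-based hash survives small perturbations that completely change a pixel-value hash. The tiny synthetic "image" and all names here are made up for illustration:

```python
import hashlib

def dhash(pixels):
    """Toy difference hash: each bit records whether brightness
    increases between horizontally adjacent pixels."""
    # `pixels` is a small grid of 0-255 values, standing in for a real
    # downscale-and-grayscale preprocessing step.
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left < right else 0)
    return bits

def hamming(a, b):
    """Number of positions where two bit lists differ."""
    return sum(x != y for x, y in zip(a, b))

# A tiny synthetic 8x9 "image" and a slightly brightened copy of it.
img = [[(r * 9 + c) * 3 % 256 for c in range(9)] for r in range(8)]
tweaked = [[min(255, v + 4) for v in row] for row in img]

# Feature-based hash: a uniform brightness shift flips no bits.
print(hamming(dhash(img), dhash(tweaked)))   # 0

# Cryptographic hash over raw pixel values: any tweak changes everything.
sha_a = hashlib.sha256(bytes(v for row in img for v in row)).hexdigest()
sha_b = hashlib.sha256(bytes(v for row in tweaked for v in row)).hexdigest()
print(sha_a == sha_b)                        # False
```

That robustness is exactly what makes adversarial attacks interesting: a hash that tolerates benign edits can, in principle, be steered by deliberately crafted ones.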
 
This question is a complete non sequitur. No, I do not approve of it. And not approving of that unrelated feature does not undermine my stance on warrantless searches and surveillance here.

And yet, this feature has been on iPhones since 2008.

Why buy and start using a device with such a feature if you oppose it?
 
Again, please get informed regarding the problem that Apple is trying to resolve and how they are doing it. Or do not back up your photos to iCloud and the problem is solved. Use Google Drive, One Drive, Dropbox, etc. and ignore the real invasions of privacy happening there.
I, for one, am informed about what Apple is trying to do, at least in principle. I understand their motivation, but their system is spyware, nothing more, nothing less.

I just hope you don't take a selfie in front of a statue of Cupid, or holding a certain album cover from Nirvana, or take any intimate photos of your partner, or even of your kids in swimsuits. And if you take such pictures in a series and one is flagged as a false positive, the others are likely to be flagged as well. Then some human being at Apple (hopefully vetted not to have a record of pedophilia or stalking) will decrypt your private photos and look them over. Just remember: the false positives flagged by Apple's hare-brained system are not going to be pictures of cats. They are going to be pictures of people, some of which might be sensitive.
 
And yet, this feature has been on iPhones since 2008.

Why buy and start using a device with such a feature if you oppose it?
Maybe you need to learn about consent? I don't like that feature, but I am willing to use my phone despite it. I do not consent to warrantless searches on my phone.

For the record, there are a lot of other things that I wish would change about iOS, but I am still willing to consent to using my iPhone despite them. That still doesn't undermine my stance here.

Consent is a very important concept. I’d be concerned if it’s something you can’t grasp.
 
I, for one, am informed about what Apple is trying to do, at least in principle. I understand their motivation, but their system is spyware, nothing more, nothing less.

I just hope you don't take a selfie in front of a statue of Cupid, or holding a certain album cover from Nirvana, or take any intimate photos of your partner, or even of your kids in swimsuits. And if you take such pictures in a series and one is flagged as a false positive, the others are likely to be flagged as well. Then some human being at Apple (hopefully vetted not to have a record of pedophilia or stalking) will decrypt your private photos and look them over. Just remember: the false positives flagged by Apple's hare-brained system are not going to be pictures of cats. They are going to be pictures of people, some of which might be sensitive.
Nice set of whataboutisms playing into each other, but I'm not worried about anything written above. But YMMV.
 
But the chances of this catching the CREATOR of the content are very low. There are a lot of sick people out there who just find these images and save them, but they are not the ones who did the abuse or even took the picture. While it's sick that someone would want these pictures, I actually want to see the creators get jail time or worse, more than someone who is just saving these pictures.

The purpose of this system isn't to stop the creators directly.

It's to reduce the flow of money in this market, thus reducing the incentive to create new material for some creators.
 
So how do you feel about the feature which can delete every app you have on your phone being installed as part of the OS?
I don't like it, but it doesn't change the trust equation any. I've had apps disappear; no biggie, I just replaced them with something else. Those apps aren't mine, the data is, and deleting my data is better than scanning it and reporting me to the government if they think it's illegal, by a large margin.
 
Exactly this… either way, just about every kid over 7 now has a porn-browsing device in their pocket, with parents who have little knowledge of it or little desire to attempt to restrict the device.
And many kids over the age of 12 are making and sharing porn…all of which qualifies as child porn.
 
So do not back up your photos to iCloud. Problem solved.
There is no Apple option to auto-sync photos between devices without using iCloud (you can transfer using a cable). You have to find some other third-party way to do it.

I have no issues with them scanning my iCloud photos, but I'm worried for those in countries whose governments will take this (on-device) technology and exploit it (not to mention that it could be hacked). I pay for the lowest tier of iCloud, but I'm thinking of moving my photos (and stopping paying for iCloud services) just to show my disagreement with their policy.
 
No, the system is not designed to detect CP.

It's designed to do the following things:

1. Identify photos which are copies of, or near derivatives of, pictures in the CSAM database
2. Be extremely good at NOT identifying any other photos

It's #2 that makes this tool so ineffective at catching pornography in general, or people participating in a protest.

It's good at catching iconic images shared by many people.
Uhh, if you consider the images in the CSAM database widely shared iconic photos, maybe you misspoke. Sadly, we can assume the CSAM database is rapidly expanding, so yes, the system is designed to detect CP. The technical article doesn't say if or how often pictures are rescanned, or what triggers a rescan. Why would Apple want to catch pornography, which is legal?
 
I’m out.

The same people again and again not acknowledging that for all intents and purposes Apple’s on-device_but_actually_cryptographically_frozen_until_it’s_uploaded scan is EQUIVALENT to server-side scan privacy-wise will soon be forgotten like tears in the rain.

To me it doesn't matter where the scanning is done if the scanning has to happen anyway and it's the exact same scanning procedure. It's the result of the scan which is important to me.

And since the scan happens only if I use iCloud Photo Library, it's easy for me to avoid if I want.
 
To me it doesn't matter where the scanning is done if the scanning has to happen anyway and it's the exact same scanning procedure. It's the result of the scan which is important to me.

And since the scan happens only if I use iCloud Photo Library, it's easy for me to avoid if I want.
It may not matter to you, but it matters to the US Constitution. If this were a government organization, they would absolutely need a warrant to do ANY kind of search on your personal property. When it comes to Apple’s own server infrastructure, that’s different.
 
To me it doesn't matter where the scanning is done if the scanning has to happen anyway and it's the exact same scanning procedure. It's the result of the scan which is important to me.

And since the scan happens only if I use iCloud Photo Library, it's easy for me to avoid if I want.

And I think that's where people (at least those trying to stay rational in this discussion) differ.

I mentioned this in another post, but I have extremely high privacy expectations (and requirements) for my personal device (it's also my work device, with access to sensitive client information). I don't have those expectations or requirements in the cloud, as that is Apple's space that I'm simply renting.
 
Why in the world would Apple talk about this at WWDC? That's a developer conference for developers to create their apps. Also since when would any company talk about controversial things they are doing at live conferences?
Because it's a feature of iOS 15 and they announce and discuss iOS features at WWDC. Next time I suggest watching the WWDC Keynote before commenting.
 
Uhh, if you consider the images in the CSAM database widely shared iconic photos, maybe you misspoke. Sadly, we can assume the CSAM database is rapidly expanding, so yes, the system is designed to detect CP. The technical article doesn't say if or how often pictures are rescanned, or what triggers a rescan. Why would Apple want to catch pornography, which is legal?

No, I meant the system is good at catching iconic photos of all kinds. The one picture from Tiananmen Square is an iconic picture. There is only one original of it from that angle.

Let's say you go to Tiananmen Square today and take pictures of the square. You can't use those pictures to catch people with the iconic version on their phone. Not even if you put tanks on the square.

When I was talking about the design goal, I was talking about the type of algorithms Apple is using. These types of algorithms weren't invented by Apple; their applications are more general. The system is good at finding photos derived from the pictures in the CSAM database, but if you put hashes of other kinds of photos in there, the system would still work. Therefore it's not designed specifically to catch CP.

Catching copyrighted images would be another good application of this system.

Let's say I take a picture of Times Square. This system would be good at finding copies of my photo but not other photos of Times Square.
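To sketch the "derived copies match, other photos of the same scene don't" point: the real system compares NeuralHash values against a blinded database via private set intersection, but a toy version with plain 64-bit hashes and a Hamming-distance threshold shows the basic idea. Every hash value and the threshold below are made up for illustration:

```python
def hamming(a, b):
    """Number of differing bits between two integer hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(h, blocklist, threshold=4):
    """Flag only if h is within `threshold` bits of a known hash."""
    return any(hamming(h, known) <= threshold for known in blocklist)

# 64-bit toy hash values (entirely made up).
iconic_original = 0xDEADBEEFCAFEF00D
blocklist = [iconic_original]

recompressed_copy = iconic_original ^ 0b111     # 3 bits flipped by re-encoding
new_photo_same_scene = 0x1234567890ABCDEF       # unrelated hash

print(matches_blocklist(recompressed_copy, blocklist))      # True
print(matches_blocklist(new_photo_same_scene, blocklist))   # False
```

A re-encoded or lightly edited copy lands a few bits away from the original and matches; a brand-new photo of the same place hashes to something unrelated and doesn't.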
 
It becomes a security and privacy issue because Apple now has full access to my entire device and can scan for whatever they want without me knowing about it. That is why I prefer this type of software to stay in the cloud.

They have always had that power!

Apple can copy everything on the device unless it was encrypted outside Apple's functions. They can delete every app or wipe the entire phone.

Apple already has code on the iPhone that can scan every file on the file system.
 
It may not matter to you, but it matters to the US Constitution. If this were a government organization, they would absolutely need a warrant to do ANY kind of search on your personal property. When it comes to Apple’s own server infrastructure, that’s different.

The local scanning process is not remotely controlled by an external entity; it lives 100% inside the phone and has no way to communicate with the outside world. It's like a wiretap that's not connected to anything, just sitting there. It only "wakes up" when the photos are uploaded to Apple's servers. That makes it tricky to consider it a search of personal property without attaching a lot of asterisks and nuances. That's why the technical implementation is super important to understand (the opposite of what the "it's not about the tech" people would have you believe with their blunt, hyperbolic, buzzwordy takes).
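For what it's worth, Apple's technical summary describes the "frozen" part with threshold secret sharing: the server can't decrypt any safety voucher until enough matches accumulate. Here's a toy Shamir secret-sharing sketch of that general idea. The 3-of-5 threshold, the field prime, and the key are illustrative, not Apple's actual parameters:

```python
import random

P = 2**61 - 1  # prime modulus for the toy finite field

def make_shares(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it,
    fewer reveal essentially nothing."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):          # Horner evaluation mod P
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

key = 123456789
shares = make_shares(key, t=3, n=5)     # e.g. one share released per match
print(reconstruct(shares[:3]) == key)   # True: threshold reached
print(reconstruct(shares[:2]))          # below threshold: a random-looking value
```

Below the threshold, the interpolation yields a value that carries no information about the key, which is the property the "frozen until uploaded and matched enough times" argument rests on.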
 