That depends on one thing:

They were already doing this server side, and have been for years. But was that processing done for all photos when iCloud was enabled, or just the ones we upload? Is it going to be done for all now, or only those we upload?

Because if it's only those we upload, and it only did it before for those uploaded, then nothing has changed. If anything, we have more privacy now, as it's done on our devices, not their servers.
 
Nope, no upgrade for me. In fact, my own research group is down the hall from our security research group, which consults for several governments and agencies across the globe, and it looks like no one there will upgrade at this point. I guess this is where the line is drawn. Scan in the cloud, not on device, problem solved.
Could you elaborate? My feeling is that scanning in the cloud is WORSE and not better than scanning on device. Could you educate me why I should rethink that?
 
That depends on one thing:

They were already doing this server side, and have been for years. But was that processing done for all photos when iCloud was enabled, or just the ones we upload? Is it going to be done for all now, or only those we upload?

Because if it's only those we upload, and it only did it before for those uploaded, then nothing has changed. If anything, we have more privacy now, as it's done on our devices, not their servers.
Were they doing this server side? It's clear Facebook and Google were: in 2019, Facebook and Google each reported hundreds of thousands of CSAM instances, and Apple reported 205. Could you educate me on the circumstances under which Apple was scanning server side?

EDIT: I quoted the 2019 numbers, but the same page has the 2020 numbers too, with similar ratios.
 
It's certainly dampened my enthusiasm. I'd been hotly debating in my mind whether to get a new MBA and iPad Mini this year. I really wanted them. M1 made Apple exciting again. I was really struggling against the temptation to buy whenever there was a sale, in the hope of a colored MBA with multiple-monitor support and a thinner-bezeled iPad Mini.

Now I look upon them with apathy. I have no enthusiasm for them anymore. They're just devices that serve a function and nothing else. Why waste money on upgrades I don't need? So, I guess I should thank Apple. They just saved me $1,500.

I'll just use my 2015 MBP until it's too old, slow or worn to be useful. Then dump it and just go with whatever laptop has the best specs for the price.

Not that I've been a huge fan the last fifteen years. My enthusiasm was already waning. Windows vs macOS and Android vs iOS wasn't a big deal to me. Both do the same job fine and I can flick back and forth without issues (except SMS sync annoyances).

Maybe they'll do something to spark my interest again. As it stands now, when I look at threads or news articles about rumored products, it all just seems so hollow.
 
Could you elaborate? My feeling is that scanning in the cloud is WORSE and not better than scanning on device. Could you educate me why I should rethink that?
I can choose what I put in the cloud and what I don't. The same can be said about Twitter/Facebook/etc. If I choose to put something in the cloud that is not E2EE, I have to be aware that someone can access it. I either accept that or switch to a provider that allows E2EE (check DEVONthink).

Apple is implementing functionality to scan data, built deep into the OS. You cannot undo this; it's there. For now it's applied to photos, and for now you can opt out, but it could be applied to any document on your device with the flip of a switch. I have confidential government information on my devices; think about what could happen. I can encrypt data on my device, but when I access it, so can Apple. Also don't forget you can't block this: Apple has implemented functionality in the past that allows them to bypass local firewalls, so traffic to/from Apple servers can't be blocked anymore. You'd have to use an external firewall.

Arguments are made that this has happened before, with search and virus scanners. Sure, but the difference is that those don't report back to Apple (or anyone else) about the results, and that makes all the difference. It's hard to get specific technology into the OS without resistance; it's much easier to change its behavior once it's there. Apple is using the CP excuse in order to take the first step. And once it's deeply embedded in the OS, the damage is done.

Also, there's no reason not to do it in the cloud. As long as they don't have E2EE, they can scan iCloud. Once they offer E2EE, the whole thing won't work anymore. They say they're manually reviewing results when an account is flagged. How are they going to do that with E2EE and without a direct backdoor to a device? It's not possible.

More food for thought:
https://www.hackerfactor.com/blog/index.php?/archives/929-One-Bad-Apple.html

We'll see where the whole thing goes. It's not going to fly outside the US. We also might be able to do something about it. Then we can send adversarial examples to other people and watch the outcry. But until all these questions are answered, Apple is on my blacklist (and I've used and worked with them for decades).
 
I can choose what I put in the cloud and what I don't. The same can be said about Twitter/Facebook/etc. If I choose to put something in the cloud that is not E2EE, I have to be aware that someone can access it. I either accept that or switch to a provider that allows E2EE (check DEVONthink).

Apple is implementing functionality to scan data, built deep into the OS. You cannot undo this; it's there. For now it's applied to photos, and for now you can opt out, but it could be applied to any document on your device with the flip of a switch. I have confidential government information on my devices; think about what could happen. I can encrypt data on my device, but when I access it, so can Apple. Also don't forget you can't block this: Apple has implemented functionality in the past that allows them to bypass local firewalls, so traffic to/from Apple servers can't be blocked anymore. You'd have to use an external firewall.

Arguments are made that this has happened before, with search and virus scanners. Sure, but the difference is that those don't report back to Apple (or anyone else) about the results, and that makes all the difference. It's hard to get specific technology into the OS without resistance; it's much easier to change its behavior once it's there. Apple is using the CP excuse in order to take the first step. And once it's deeply embedded in the OS, the damage is done.

Also, there's no reason not to do it in the cloud. As long as they don't have E2EE, they can scan iCloud. Once they offer E2EE, the whole thing won't work anymore. They say they're manually reviewing results when an account is flagged. How are they going to do that with E2EE and without a direct backdoor to a device? It's not possible.

More food for thought:
https://www.hackerfactor.com/blog/index.php?/archives/929-One-Bad-Apple.html

We'll see where the whole thing goes. It's not going to fly outside the US. We also might be able to do something about it. Then we can send adversarial examples to other people and watch the outcry. But until all these questions are answered, Apple is on my blacklist (and I've used and worked with them for decades).
Ah, thanks, that's an angle I hadn't considered.

As to the question of how they can evaluate pictures, evidently there are these "safety vouchers" attached to the photos, which are encrypted and Apple doesn't have the key UNLESS you've reached the threshold of matched photo fingerprints. These "safety vouchers" include a derivative of the photo, essentially a very low-res version of it. So Apple can examine the image, at least in low-res detail, without ever seeing the full-resolution original.

But here I'm really just repeating what I've read elsewhere, not speaking from personal knowledge.
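
Still, for anyone curious how "Apple doesn't have the key UNLESS you've reached the threshold" can be enforced by math rather than by policy, here's a toy sketch using Shamir secret sharing. This is purely my own illustration, not Apple's protocol (the real design also involves private set intersection and a blinded hash database), and the threshold and numbers below are made up:

```python
# Toy model of the threshold idea: one share of the account's voucher key is
# attached to each *matching* photo's safety voucher. Below the threshold,
# the server mathematically cannot reconstruct the key.
import random

PRIME = 2**127 - 1   # prime field for the toy secret sharing
THRESHOLD = 3        # made-up threshold; Apple hasn't published the real one

def make_shares(secret, threshold, n):
    """Split `secret` into n Shamir shares; any `threshold` of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0; only meaningful with >= threshold shares."""
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

account_key = random.randrange(PRIME)
shares = make_shares(account_key, THRESHOLD, n=10)

print(recover(shares[:THRESHOLD - 1]) == account_key)  # False (overwhelmingly likely): too few matches
print(recover(shares[:THRESHOLD]) == account_key)      # True: threshold reached, the encrypted
                                                       # low-res derivatives become reviewable
```

The point is just that below the threshold there is nothing for a reviewer to decrypt, not merely a policy saying they won't look.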
 
As to the question of how they can evaluate pictures, evidently there are these "safety vouchers" attached to the photos, which are encrypted and Apple doesn't have the key UNLESS you've reached the threshold of matched photo fingerprints. These "safety vouchers" have included in them a derivative of the photo, essentially a very lo-res version of it. So Apple can examine the photo, at least in lo-res detail, without actually seeing the photo itself.
Yes, but they wouldn't need it without E2EE. If they had a decryption key to E2EE data, they'd be able to access all your data, making the whole process of E2EE pointless. The only other option is a backdoor.
 
I am not comfortable with this move by Apple and do not completely agree with it; however, no, it will not affect my decision to upgrade. The only change happening here is that they are moving the scanning from the iCloud end to on-device.
That depends on one thing:

They were already doing this server side, and have been for years. But was that processing done for all photos when iCloud was enabled, or just the ones we upload? Is it going to be done for all now, or only those we upload?

Because if it's only those we upload, and it only did it before for those uploaded, then nothing has changed. If anything, we have more privacy now, as it's done on our devices, not their servers.
The scanning is/was done in iCloud on any photos that were uploaded. In iOS 15 they're moving the scanning to on-device, only for photos that get uploaded to iCloud, and they are only scanned at the time of upload. They will not be randomly going through and scanning your entire phone's photo library. And if you disable iCloud Photos, nothing gets scanned.
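
To make that concrete, here's a rough sketch of the gating as I read Apple's description (not their code; the helpers are stand-ins, and in the real protocol the hash database is blinded, so the device can't even tell locally whether a photo matched, which this toy version ignores):

```python
# Toy sketch: the hash comparison only runs on the iCloud upload path.
def toy_perceptual_hash(photo_bytes: bytes) -> int:
    # Stand-in for NeuralHash (a neural-network perceptual hash, not a byte hash).
    return hash(photo_bytes)

def queue_for_icloud(photo_bytes: bytes, icloud_photos_enabled: bool, known_hashes: set):
    if not icloud_photos_enabled:
        return None  # iCloud Photos off: the photo is never hashed or scanned
    # The comparison happens here, at upload time, not as a background sweep
    # of the whole library.
    voucher = {"matched": toy_perceptual_hash(photo_bytes) in known_hashes}
    return {"photo": photo_bytes, "safety_voucher": voucher}
```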
 
I would expect there are levels to this:

--I am not worried, I'd still buy a new iPhone.
--I am a little worried, I'd buy a new iPhone but I will consider not using iCloud Photo Library.
--I am somewhat worried, I will definitely be turning off iCloud Photo Library and putting off buying a new iPhone.
--I am definitely worried, I will not be buying a new iPhone.
--I am upset, I will be leaving the iPhone ecosystem and switching to a different platform.
 
Yes, stepped approach.
Ordering an SE now instead of waiting for iPhone 13 and will remain on iOS 14.
Will cease to use iCloud Photos and cancel my iCloud 200 GB plan.

If Apple pushes this further, I will stop using iOS completely. But I am taking a wait and see approach.
They will make less revenue from me, entirely because of the way they added this functionality.
A drop in the bucket, or the ocean, but it's all I can do.
 
Basically the question is in the title y’all.

I would have created a poll but I’m using Tapatalk.

I'm on the fence about whether this would sway me over to the dark side, but usually after Apple adopts something, the whole tech world jumps on board, or it blows up in the mainstream if it had already been done on Android.

I would just be leery of giving up my Apple Watch and having to switch over to Android.

I also thought I would never say this, but if I did switch, I would most likely get the Pixel 6.
From what I have read, Google, MS, FB, and the other big tech companies are already doing this in one form or another.
 
From what I have read, Google, MS, FB, and the other big tech companies are already doing this in one form or another.
I think the difference is that everyone (including Apple) was doing it on the server side. Nobody was doing it on the phone side until now, pioneered by Apple. IMO that's the issue many people are having. It's one thing to scan what's on their own servers. But to go into the user's phone, do the scan there, and code the hashes into the OS for all iPhones worldwide? That's a bit much.
 
Not that I've been a huge fan the last fifteen years. My enthusiasm was already waning. Windows vs macOS and Android vs iOS wasn't a big deal to me. Both do the same job fine and I can flick back and forth without issues (except SMS sync annoyances).

Maybe they'll do something to spark my interest again. As it stands now, when I look at threads or news articles about rumored products, it all just seems so hollow.

It's done the same thing to me and my enthusiasm.

Very bummed about it all
 
I don't see this as much of a change. Moving the scanning to on device instead of on the server? I'm still not quite getting the outrage. Scanning for matches on hashes is not very invasive. I think people forget that Apple already has software scanning your photos for things like tagging and matching names. How does the intent of the software make it more risky for privacy? If Apple isn't trustworthy, then why was scanning your photos before alright but now scanning for CSAM a dangerous slippery slope? Apple could have repurposed the photo scanning for names and tags without telling anyone since that software was already installed. Instead they did the responsible thing and created a dedicated scanner that can only be used for one thing and told the world about it.
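
To show what "matching on hashes" looks like in practice, here's a toy perceptual hash, a simple average hash rather than Apple's NeuralHash (which is a neural network), with a made-up blocklist value. The point is that only a short fingerprint is compared against a fixed list; nothing here classifies or labels what the photo depicts:

```python
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """64-bit toy perceptual hash: downscale, grayscale, threshold against the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return int("".join("1" if p > mean else "0" for p in pixels), 2)

# Hypothetical blocklist of known-bad fingerprints; in the real system the
# list is blinded, so the device can't read it directly.
KNOWN_HASHES = {0x0123456789ABCDEF}

def matches_known_list(path: str) -> bool:
    # Only the fingerprint is compared; the image content is never inspected here.
    return average_hash(path) in KNOWN_HASHES
```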
 
That depends on one thing:

They were already doing this server side, and have been for years. But was that processing done for all photos when iCloud was enabled, or just the ones we upload? Is it going to be done for all now, or only those we upload?

Because if it's only those we upload, and it only did it before for those uploaded, then nothing has changed. If anything, we have more privacy now, as it's done on our devices, not their servers.

Yes that is essentially what is happening.

If you have iCloud Photos turned on, the photo will be matched against the known CSAM hashes, and if there's a match, Apple will be notified and will send it off to the agency that protects the kids (can't remember the name right now LOL)

Apple can't see what the image is at all and doesn't know about the images in your library; it is only matching the hashes.
 
I'm OK with Apple; the recent events and my understanding of them have not affected my opinion.

Striking the balance between personal freedoms and societal benefit is very tough. Striking the balance between various conflicting issues that all aim to benefit society is tough. At this point I'm just going with my gut and saying that Apple is making the call I would have made.
 
Yes that is essentially what is happening.

If you have iCloud Photos turned on, the photo will be matched against the known CSAM hashes, and if there's a match, Apple will be notified and will send it off to the agency that protects the kids (can't remember the name right now LOL)

Apple can't see what the image is at all and doesn't know about the images in your library; it is only matching the hashes.
This seems to be correct, with one additional protection: it takes more than one match against the database for Apple to be notified. The threshold has not been revealed, but it is more than one match.
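
As a back-of-the-envelope sketch of that protection, with a made-up threshold (the real number isn't public, and in the actual design the limit is enforced cryptographically rather than by a simple counter like this):

```python
from collections import defaultdict

REVIEW_THRESHOLD = 5                 # hypothetical; Apple hasn't published the value
match_counts = defaultdict(int)      # per-account count of matching vouchers

def receive_voucher(account_id: str, matched: bool) -> bool:
    """Toy server-side model: nothing is flagged for human review below the threshold."""
    if matched:
        match_counts[account_id] += 1
    return match_counts[account_id] >= REVIEW_THRESHOLD
```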
 