He brings up a valid point that I’ve been asking. If the local photos library on the device is identical to the iCloud Photo Library (they should be if you are using photos in the cloud), why is there anything happening on the device side to begin with? They probably (hopefully) already scan users’ photos on their own servers for CSAM. Why break the user’s trust and do it on the client side as well?

The automatic filtering of explicit photos for kids with parental controls is a good idea though.
 
Overall, I have to agree with Stamos. I think it’s a good summary and overview of the different viewpoints involved.
 
let canIOptOut: Int = 0

OK, well that was hard for Apple to defeat. Maybe they wouldn’t even bother to announce they made this little change.
Meh... strawman about what could be, not what it really is. Why would they bother to announce what they already have?
 
He brings up a valid point that I’ve been asking. If the local photos library on the device is identical to the iCloud Photo Library (they should be if you are using photos in the cloud), why is there anything happening on the device side to begin with? They probably (hopefully) already scan users’ photos on their own servers for CSAM. Why break the user’s trust and do it on the client side as well?

The automatic filtering of explicit photos for kids with parental controls is a good idea though.
I don't think Apple has access to the photos on the cloud level. They are all encrypted. The only way to scan them is on the device, before they are encrypted.
 
I don't think Apple has access to the photos on the cloud level. They are all encrypted. The only way to scan them is on the device, before they are encrypted.
Wrong. Apple can get to iCloud data. Heck, last year they turned over iCloud data of congresspeople and Washington Post reporters while being gagged by an NSL. Those people all found out a couple of months ago that their data had been turned over to the government.


 
One thing that bothers me about what Apple is doing: the only way they could accomplish it is to generate an edge-detect map of the photo, turn that into a vector map, then send that off to be compared against the library. This would be quite easy to do, as they could reuse the JPEG compression functions. The problem is that a competent programmer could use this data to reconstruct a decent rendition of the original photo.
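For what it's worth, Apple's published design describes NeuralHash, a learned perceptual hash, rather than an edge-map pipeline. Still, the general idea of boiling an image down to a small, comparable fingerprint can be sketched with a toy "difference hash" — everything below is illustrative, not Apple's algorithm:

```python
# Toy perceptual hash ("dHash"): each bit records whether a pixel is
# brighter than its right-hand neighbour. Not Apple's NeuralHash; just
# an illustration of image fingerprinting.

def dhash(gray, hash_size=8):
    """Hash a 2D grayscale image (list of rows), assumed pre-scaled to
    hash_size rows by hash_size + 1 columns, into a 64-bit integer."""
    bits = 0
    for row in gray:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes (lower = more similar)."""
    return bin(a ^ b).count("1")

# Two nearly identical 8x9 gradients hash to nearly identical values.
img1 = [[c * 10 for c in range(9)] for _ in range(8)]
img2 = [row[:] for row in img1]
img2[0][0] = 255  # small perturbation
print(hamming(dhash(img1), dhash(img2)))  # small distance: still a "match"
```

Note that a 64-bit fingerprint like this discards nearly all of the pixel data, so reconstructing a "decent rendition" from the hash alone would be much harder than the post suggests, though richer feature vectors would leak more.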
 
Wrong. Apple can get to iCloud data. Heck, last year they turned over iCloud data of congresspeople and Washington Post reporters while being gagged by an NSL. Those people all found out a couple of months ago that their data had been turned over to the government.
Well... they are encrypted, though. While it's true Apple still has the keys to the locks, they clearly don't like to use them, and do so only when forced by the government. I suspect they want to move to not having the keys at all so they can't even be forced to turn them over.
 
your data, not theirs.
You should probably read Facebook’s EULA to see exactly what rights you are giving up to Facebook.

 
Wrong. Apple can get to iCloud data. Heck, last year they turned over iCloud data of congresspeople and Washington Post reporters while being gagged by an NSL. Those people all found out a couple of months ago that their data had been turned over to the government.


If you read further, it was metadata or pen-register data, not full dumps of the iCloud accounts. It was who they called and texted, and when they contacted them.
 
Well, any comment coming from Facebook about privacy has little legitimacy in terms of digital sovereignty. Nevertheless, I find Alex Stamos' arguments very appropriate.
Apple is not that concerned about privacy. If Apple cared about human beings' sovereignty and privacy, iPhones would have two photo libraries: one that could be shared with other apps (WhatsApp, Telegram, Instagram, iCloud, iMovie... any app) and another that could not be reached by any app whatsoever.
Users would then configure whether they want, by default, their pictures in one library or the other (the absolutely private one, or the one reachable by other apps). Wanting to share pictures and videos seamlessly shouldn't require sacrificing real privacy. If you are concerned about real privacy, you might want to use, let's say, Telegram without letting the app access your whole photo and video library. You should be able to grant access to one specific photo and video library only.

The same goes for any type of documents or contacts: there should be a contact library that other apps can access and another that no app can access whatsoever. That would indeed respect users' (clients') privacy. The rest is mostly fantasy.
 
You should probably read Facebook’s EULA to see exactly what rights you are giving up to Facebook.

They own my car and my house now?
 
Too bad the way Apple has announced their new system has been all stick and no carrot. “Yes please, I’d like to sign up to have my phone snitch on me.”
That’s a gross misrepresentation of how their system works, though. Perhaps read their FAQ.

If the local photos library on the device is identical to the iCloud Photo Library (they should be if you are using photos in the cloud), why is there anything happening on the device side to begin with?
It’s to make the code running on Apple’s side access less data. Doing it this way means they only need to access photos in the rare positive cases; doing it entirely server-side, they would need to access all photos. Apple explained this in the FAQ.

I don't think Apple has access to the photos on the cloud level. They are all encrypted. The only way to scan them is on the device, before they are encrypted.
They do. Photos are not end to end encrypted.

Wrong. Apple can get to iCloud data.
Well, they can access some data, including photos. Not all data.
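On the "rare positive cases" point above: Apple's technical summary describes private set intersection combined with threshold secret sharing, so the server can only decrypt a device's safety vouchers once some threshold of matches exists. The full protocol is far more involved, but the threshold property itself can be sketched with plain Shamir secret sharing (all parameters below are made up for illustration):

```python
# Toy model of the threshold property Apple describes: below t positive
# matches, the per-device key cannot be reconstructed. This is textbook
# Shamir secret sharing only, not Apple's actual PSI protocol.
import random

P = 2**61 - 1  # a Mersenne prime used as the field modulus

def make_shares(secret, t, n):
    """Split `secret` into n shares; any t reconstruct it, t-1 reveal nothing."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

device_key = 123456789   # stands in for the per-device decryption key
t, n = 3, 10             # hypothetical: threshold 3 out of 10 matches
shares = make_shares(device_key, t, n)

print(reconstruct(shares[:3]) == device_key)  # True: threshold reached
print(reconstruct(shares[:2]) == device_key)  # False: below threshold
```

With fewer than t shares the interpolation yields a value that is (with overwhelming probability) unrelated to the key, which is the sense in which the server "learns nothing" below the threshold.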
 
That’s a gross misrepresentation of how their system works, though. Perhaps read their FAQ.


It’s to make the code running on Apple’s side access less data. Doing it this way means they only need to access photos in the rare positive cases; doing it entirely server-side, they would need to access all photos. Apple explained this in the FAQ.


They do. Photos are not end to end encrypted.


Well, they can access some data, including photos. Not all data.
According to Apple, photos are end to end encrypted.

 
According to Apple, photos are end to end encrypted.

Photos aren’t E2EE. I followed the link you posted, and while photos are encrypted on iCloud, it’s not E2E. Apple can decrypt photos server-side if they want to.

E2E Encrypted stuff with Apple:
  • Apple Card transactions (requires iOS 12.4 or later)
  • Home data
  • Health data (requires iOS 12 or later)
  • iCloud Keychain (includes all of your saved accounts and passwords)
  • Maps Favorites, Collections and search history (requires iOS 13 or later)
  • Memoji (requires iOS 12.1 or later)
  • Payment information
  • QuickType Keyboard learned vocabulary (requires iOS 11 or later)
  • Safari History and iCloud Tabs (requires iOS 13 or later)
  • Screen Time
  • Siri information
  • Wi-Fi passwords
  • W1 and H1 Bluetooth keys (requires iOS 13 or later)
  • iMessages (but with iCloud backups enabled, Apple can restore and access your messages)
 
Facebook is not making these statements. Its FORMER security chief is.
Precisely this.

Full disclosure: while I have not worked at Facebook, I have worked with Alex Stamos (I am a former IT admin for iSEC Partners/NCC Group, where Alex Stamos was a founding partner). His batting average isn't perfect (whose is?), but it's decidedly better than average in the field of computer security. While we worked together, Facebook was a client of iSEC Partners, and we helped FB implement TLS and performed IPv6 scans, among other things. I departed iSEC Partners/NCC Group at the end of 2011; Stamos later went on to found Artemis Security and then served as CSO of Yahoo!. He left Yahoo!, if I understand correctly, after determining that there was a severe breach compromising most user accounts; when he wanted to implement a mandatory passphrase change, he received pushback from other executives and the Board of Directors. Given that CSO-level roles are legally liable for security decisions, he decided that if his hands were tied to the point where he couldn't do his job at such a basic level as mandating passphrase changes on known-compromised accounts, then he didn't want to get paid to be Yahoo!'s fall guy. Seems like a wise move, and one with the users' best interests at heart.

His departure from Facebook circa 2018 was, as I recall, over not entirely dissimilar grounds, after information regarding the Cambridge Analytica breach came out. It appeared as if he did his best at damage control for a while, even going so far as to suggest that Mark Zuckerberg step down as CEO. After all, being implicated as the leader of any organization facing involvement with war crimes and election manipulation is a bad look, and just as Bill Gates stepping down as CEO of Microsoft while losing antitrust lawsuits didn't stop Gates from being a multibillionaire, there is little wrong with letting someone else take over the reins. Zuckerberg didn't acquiesce to such recommendations, and as near as I can discern, Alex Stamos thought it best to jump from a sinking, flaming ship after he realized that no amount of damage control and bailing could save it.

Since then I guess he's been a staff researcher with Stanford? I took some issue with his apparent endorsement of Zoom a year or two ago (which has since lost an $85 million lawsuit over their end-to-end encryption implementation being bollocks, as I could discern back when I looked at their "technical" documents), but as he sheepishly admitted in a recent YouTube live stream with Matthew Green, David Thiel (another former iSEC Partners/NCC Group coworker of mine) et al. over the recent Apple CSAM scanning decision, paraphrasing: "even some of the recent end-to-end encryption claims of vendors are difficult for experts to discern correctly." Albeit, that eating of crow applies to things such as Apple's iCloud not being end-to-end encrypted, and to Apple having repeatedly cooperated with law enforcement, of particular concern in places such as China, where they have handed over user data to governmental agencies (e.g. https://www.cpomagazine.com/data-privacy/icloud-data-turned-over-to-chinese-government-conflicts-with-apples-privacy-first-focus/#:~:text=Apple does not have the,be stored within the country.).

I'm more of a nerdy ops guy who fixates on bit-level optimizations and Kolmogorov-complexity reduction in code implementations, and I don't profess to even want to wear the sorts of hats that Alex Stamos has. Nonetheless, while I can state that I helped him unlock accounts in the past when we worked together, I've never testified in front of Congress as he has. Nor would I want to. I think he and I still "fight for the users" more than not, but we may do so in different ways.

Suffice it to say, Alex Stamos hasn't spoken on Facebook's behalf for a few years now, but even before he worked as CSO at Facebook, he was deeply involved in the field of computer security, and I consider him to be among the more seasoned and ethical practitioners in the field. Discounting someone's viewpoint because of a contextualization of one of their past employers is a pretty narrow perspective, and given that I have worked as a janitor and in far humbler positions for some pretty heinous employers in my past, I would sincerely question how anyone's past career history reflects on them as a person, particularly if most of their actions as an employee seem admirable. I know some people with a lot of skeletons in their closets, myself included, but Alex Stamos never read that way to me.

The "screeching minority" categorizations of those who have raised concerns about Apple's local CSAM-scanning implementation are more than concerning; they are dismissive. I don't think I've read or heard a single perspective, even from trolls, advocating for child abuse. Meanwhile, hash collisions are a well-studied area of computer science, with even cryptographic hashes such as MD5 and SHA-1 having fallen by the wayside in recent years due to chosen-prefix attacks. Another colleague and computer forensics expert, Cory Altheide, some years ago described to me malware that would drop files with CSAM hash collisions so as to shift investigative onus, and I found that prospect chilling. The likelihood that similar techniques can and will be used against Apple's implementation seems nonzero, and from my perhaps paranoid vantage, extremely likely as a means to facilitate unwarranted governmental eavesdropping based on spurious levels of "probable cause". For those who think no one is that nefarious, perhaps they need a reminder of the threat levels of nation-state adversaries using known malware such as Pegasus, sold by the Israeli NSO Group.
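To put the collision worry in concrete terms: collision resistance collapses as a hash's output shrinks, per the birthday bound, and perceptual hashes are short by design. A toy demonstration with a deliberately truncated SHA-256 (this is not NeuralHash, just an illustration of how small a small hash space really is):

```python
# Brute-forcing a collision on a truncated hash. With 24 bits of output,
# the birthday bound (~2^12) means a colliding pair appears within a few
# thousand attempts; the full 256-bit digest remains collision-resistant.
import hashlib

def short_hash(data: bytes, bits: int = 24) -> int:
    """Truncate SHA-256 to `bits` bits, mimicking a small hash space."""
    digest = int.from_bytes(hashlib.sha256(data).digest(), "big")
    return digest >> (256 - bits)

def find_collision(bits: int = 24):
    """Return two distinct inputs sharing the same truncated hash."""
    seen = {}
    i = 0
    while True:
        msg = f"sample-{i}".encode()
        h = short_hash(msg, bits)
        if h in seen:
            return seen[h], msg
        seen[h] = msg
        i += 1

a, b = find_collision()
print(f"{a!r} and {b!r} share a 24-bit hash")
```

Crafting a benign file that collides with a *specific* target hash (a second-preimage, which is what the malware scenario above requires) is harder than finding an arbitrary collision, but for short or learned hashes it is still far from infeasible.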

At the very least, Apple's multi-million-dollar ad campaign "Privacy. That's iPhone." now reads like Orwellian doublespeak after this most recent decision. As another colleague posited: it seems likely this was a move Apple was forced to take lest Tim Cook face even worse choices from governmental pressure for cryptographic backdoors.

Regardless, the canary in the coal mine looks awfully dead to me.
 
Wow, it's like a mini cancel-culture starting to form here... "he's from Facebook so his views should be mocked with a snide comment and disregarded"... you learn by listening, not by shutting down conversations.
Well stated. As I said in an earlier post: Google and Facebook at least came out of the gate telling everyone the truth up front: "WE USE YOUR DATA..." Apple, not so much.
 