Like Apple is scanning our photos just in case someone out there has an image that is illegal to possess?

Do you see the irony of your statement there? Apple is assuming OUR guilt, and is acting as an investigator and mandatory reporter, violating our right to privacy and the 4th Amendment's protection against unwarranted search without cause.
No. Because Apple is only scanning it when you try to send it to their servers. They can't scan it on the server side because they promised you the data would be encrypted and they would never be able to see it for themselves.
This is Apple KEEPING their promise of privacy.
 
No. Because Apple is only scanning it when you try to send it to their servers. They can't scan it on the server side because they promised you the data would be encrypted and they would never be able to see it for themselves.
This is Apple KEEPING their promise of privacy.
Please. If you open your iCloud photos in a browser, it's unencrypted.
Don't tell me Apple can't un-encrypt your stuff.
And before you say "Well, it's encrypted by your password"... umm... keychain... on Apple servers.
 
Until somebody figured out what they were doing and outed them. As noted by somebody else: Apple almost certainly made this announcement when they did to beat getting outed.

Uh-huh, accusing Apple of something they *almost* certainly did... 🙄

Really? And you've been writing software in a production environment for how many years?
About 30 years. I've been programming professionally since before I graduated from high school. How long have you been doing it?
 
Please. If you open your iCloud photos in a browser, it's unencrypted.
Yes, AFTER you have entered your password. They can't do it without that.
Don't tell me Apple can't un-encrypt your stuff.
Apple can't decrypt your stuff without your credentials.
And before you say "Well, it's encrypted by your password"... umm... keychain... on Apple servers.
The keychain is encrypted on Apple's servers and still needs your master password to decrypt.
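To make that claim concrete, here is a minimal Python sketch of password-derived encryption. It is not Apple's actual iCloud Keychain design (the password, salt handling, and Fernet wrapper below are stand-ins); it only shows why a server that holds the ciphertext and salt still can't decrypt anything without the user's password:

```python
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(password: str, salt: bytes) -> bytes:
    # Stretch the user's password into a 32-byte symmetric key.
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(password.encode()))


salt = os.urandom(16)
key = derive_key("users-master-password", salt)  # hypothetical password
token = Fernet(key).encrypt(b"keychain item: example.com / hunter2")

# A server can store `salt` and `token`, but without the password it cannot
# re-derive `key`, so the stored blob is opaque to it.
print(Fernet(key).decrypt(token))
```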
 
Yes, AFTER you have entered your password. They can't do it without that.

Apple can't decrypt your stuff without your credentials.

The keychain is encrypted on Apple's servers and still needs your master password to decrypt.
I have been a systems administrator for 30+ years.
The only thing that keeps me out of my users' accounts is MY integrity. Not Apple's. Not Microsoft's.
You don't honestly think that if Apple wanted into your encrypted files, they couldn't get into them? Even by forcing a password reset on your iCloud account?
 
I have been a systems administrator for 30+ years.
The only thing that keeps me out of my users' accounts is MY integrity. Not Apple's. Not Microsoft's.
Are all your users' files encrypted?
You don't honestly think that if Apple wanted into your encrypted files, they couldn't get into them? Even by forcing a password reset on your iCloud account?
Do you honestly know for a fact that they can? Maybe they can by forcing a reset as you say, I'm not sure.

My point is that Apple clearly states that Photos are encrypted "in transit":

https://support.apple.com/en-us/HT202303

That precludes them from scanning them on their servers.
 
Are all your users' files encrypted?

Do you honestly know for a fact that they can? Maybe they can by forcing a reset as you say, I'm not sure.

My point is that Apple clearly states that Photos are encrypted "in transit":

https://support.apple.com/en-us/HT202303

That precludes them from scanning them on their servers.
Marvin, I've read your other posts. You and I have similar backgrounds; both developers for a long time, a lot of systems stuff.

You know as well as I do, if you need into someone's account, you change their email address, force a password reset, and change their password and log in as them.
 
Marvin, I've read your other posts. You and I have similar backgrounds; both developers for a long time, a lot of systems stuff.

You know as well as I do, if you need into someone's account, you change their email address, force a password reset, and change their password and log in as them.
That can work on some systems. Do you know if an iCloud email address can be changed prior to gaining access to that account like you suggest?

Ultimately there is some element of trust.

Nothing Apple has done with respect to this feature has eroded that trust for me. They have not "created a backdoor". They have not given away my private data. They are looking for a matching hash for CSAM only when I try to send an image to their servers. That's not spying on me. It makes no material difference if that scan happens on my device or theirs.
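For anyone who wants that spelled out, here is a deliberately simplified Python sketch of what "scan only on the upload path" means. It is not Apple's protocol (the real system uses a perceptual NeuralHash, a blinded on-device database, and private set intersection with a match threshold), and every name below is made up for illustration:

```python
import hashlib

# Toy stand-in for the known-image hash list; the real database is not public
# and Apple ships it to devices only in a blinded form.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}


def matches_database(image_bytes: bytes) -> bool:
    # Plain SHA-256 for illustration; Apple's system uses a perceptual hash
    # (NeuralHash) so visually identical images still match after re-encoding.
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_HASHES


def send_to_icloud(data: bytes, safety_voucher: bool) -> None:
    # Hypothetical upload call, just for the example.
    print(f"uploading {len(data)} bytes, voucher attached: {safety_voucher}")


def upload_photo(image_bytes: bytes) -> None:
    # The comparison happens only on the upload path: photos that never
    # leave the device are never checked against the database.
    flagged = matches_database(image_bytes)
    send_to_icloud(image_bytes, safety_voucher=flagged)


upload_photo(b"\x89PNG...fake image bytes for the example")
```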
 
Third paragraph, on this page.


That’s a nice generalized statement. Of course, the info eventually gets used by law enforcement, but Apple does not deal directly with law enforcement.

They report any findings to the NCMEC, which then works with law enforcement.

Apple has no contact with any law enforcement or government agencies unless they decide to speak with Apple as part of an investigation.

Read the detailed analysis of how this works.
 
That’s a nice generalized statement. Of course, the info eventually gets used by law enforcement, but Apple does not deal directly with law enforcement.

They report any findings to the NCMEC, which then works with law enforcement.

Apple has no contact with any law enforcement or government agencies unless they decide to speak with Apple as part of an investigation.

Read the detailed analysis of how this works.
Jesus. Even when you're pointed to the proof, you still deny it.
 
That can work on some systems. Do you know if an iCloud email address can be changed prior to gaining access to that account like you suggest?

Ultimately there is some element of trust.

Nothing Apple has done with respect to this feature has eroded that trust for me. They have not "created a backdoor". They have not given away my private data. They are looking for a matching hash for CSAM only when I try to send an image to their servers. That's not spying on me. It makes no material difference if that scan happens on my device or theirs.
And you're correct. In its current description, it's fine. Scan people's phones before photos upload to iCloud and hash them against a database. In the land of unicorns and rainbows, that would be a great solution.

The concern here is the potential for abuse. Because Apple *will* bend to the will of the government; whether it's ours or China's or whoever's. They already did, by agreeing to give China access and control over the iCloud servers in China, and guaranteeing that all Chinese citizens will use the Chinese-controlled iCloud servers.

What happens when China passes a law demanding that all anti-regime propaganda is illegal, and demands that Apple now scan for that illegal propaganda? What happens if the USA, through the Patriot Act, demands that all terroristic propaganda stored in iCloud is also illegal, and shall be scanned and watched for?

It's not a big leap to include non-CSAM hashes in their hash database....

Do you understand now why it could be misused?

Stopping child porn = good idea. Stopping pedophiles = good idea. Agreeing to surveillance of a billion users "Just in case" = bad idea.
 
I know I posted this in another thread... but here is the important part. This is not a requirement by the government towards Apple. This is Apple's CHOICE.

Many people have stated that Apple is required to do this, as a service provider.

Someone also linked to the actual law; 18USC2258A.

Here's an interesting part of this. 2258A, section (f)

(f) Protection of Privacy.-Nothing in this section shall be construed to require a provider to-

(1) monitor any user, subscriber, or customer of that provider;

(2) monitor the content of any communication of any person described in paragraph (1); or

(3) affirmatively search, screen, or scan for facts or circumstances described in sections (a) and (b).

Now... read that again carefully. NOTHING in this section shall be construed to *require* a provider to...
... MONITOR ANY USER, SUBSCRIBER, OR CUSTOMER
... MONITOR THE CONTENT OF ANY COMMUNICATION...
... AFFIRMATIVELY SEARCH, SCREEN OR SCAN FOR FACTS OR CIRCUMSTANCES.


That being said, this is a CHOICE by Apple... and NOT A REQUIREMENT. In fact, the law specifically says that they are NOT REQUIRED to scan, monitor or search for CSAM. Just to report it if it is discovered.
 
And you're correct. In its current description, it's fine.
Then let's stop the scaremongering about what hasn't been done, as if this has somehow created that opportunity where it didn't exist before. That isn't the case; the technology required for some forced mass surveillance was not developed by Apple for this feature. It has existed since it has been possible to compare image hashes with computers, before Apple decided to use it to protect children from abuse.

The concern here is the potential for abuse. Because Apple *will* bend to the will of the government; whether it's ours or China's or whoever's. They already did, by agreeing to give China access and control over the iCloud servers in China, and guaranteeing that all Chinese citizens will use the Chinese-controlled iCloud servers.

What happens when China passes a law demanding that all anti-regime propaganda is illegal, and demands that Apple now scan for that illegal propaganda? What happens if the USA, through the Patriot Act, demands that all terroristic propaganda stored in iCloud is also illegal, and shall be scanned and watched for?

It's not a big leap to include non-CSAM hashes in their hash database....
It's not a big leap from iOS 14 (not having this feature) to that. Nothing about government forcing their hand has anything to do with this feature. Some government could demand that regardless of Apple implementing what they have so far. I.e. some government could come around and demand that they do exactly what they have proposed to do - see, it makes no difference.
Do you understand now why it could be misused?
No. Because the misuse does not require this as a first step. It could be done outright even if this feature weren't already there. This feature doesn't change things.
Stopping child porn = good idea. Stopping pedophiles = good idea. Agreeing to surveillance of a billion users "Just in case" = bad idea.
As you agreed, "In its current description, it's fine." In some other form it is something else. This "fine" feature hasn't suddenly enabled some other form that we all agree is a "bad idea". That bad idea feature was always possible before this feature. The argument that this is a step closer doesn't hold up. If some big bad government has to take two steps instead of one to violate your rights, they will.

Edit:
I should add that governments have already done things like forcing copiers to not duplicate currency accurately, and printer manufacturers already add micro dots that can trace your printouts back to your particular printer... all sorts of scary stuff that uses your own devices to violate your privacy started happening a long time ago. This isn't an instance of that, and if you are worried about that happening you are years too late.
 
Uh-huh, accusing Apple of something they *almost* certainly did... 🙄
Guess you haven't seen the latest news. There's no longer any "almost certainly."

About 30 years. I've been programming professionally since before I graduated from high school. How long have you been doing it?
I started with hand-coding 8080 machine language, in octal, no less, in 1975.

And you'll have to excuse me for being skeptical of your claim. Nobody who's ever worked in a production software development environment would ever suggest "They just need a little bit of software that isn't difficult to make." Software development, particularly for an existing product, is a lot more involved than that--if you're doing it right.

And you're correct. In its current description, it's fine.
Disagree. On-device scanning is not and never will be "fine," in my view. Not for any reason or purpose, by anybody. Ever.
 
Then let's stop the scaremongering about what hasn't been done, as if this has somehow created that opportunity where it didn't exist before. That isn't the case; the technology required for some forced mass surveillance was not developed by Apple for this feature. It has existed since it has been possible to compare image hashes with computers, before Apple decided to use it to protect children from abuse.
And that's kind of my point (see my previous post). This is a DECISION by Apple. Not a requirement. In fact, the law that they are using to justify this move specifically states... (and I quote)

(f) Protection of Privacy.-Nothing in this section shall be construed to require a provider to-
(1) monitor any user, subscriber, or customer of that provider;
(2) monitor the content of any communication of any person described in paragraph (1); or
(3) affirmatively search, screen, or scan for facts or circumstances described in sections (a) and (b).

... This law specifically and very clearly states that a provider is NOT required to do what they are doing.
Apple seems to have ignored this fact....

(Source: https://uscode.house.gov/view.xhtml...lim-title18-section2258A&num=0&edition=prelim)
 
Here is a video that many should check out. One of the key issues pointed out at the start is that Apple's own words state the system will be "expanded" in the future.
 
Guess you haven't seen the latest news. There's no longer any "almost certainly."
Perhaps I haven't. Do you have a link? I did a quick search and didn't find anything pointing to Apple making this statement because some security researchers forced their hand.

And you'll have to excuse me for being skeptical of your claim. Nobody who's ever worked in a production software development environment would ever suggest "They just need a little bit of software that isn't difficult to make."
It's not difficult because it has already been done. The hard part has been figured out. There are neural network models available for this stuff and tools to train on newer data sets. The software would be built on decades of research at this point. It's not difficult with the tools we have these days for a team of professionals to implement this feature.
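As a rough illustration of how much of this is off-the-shelf these days, a pretrained image classifier can be stood up in a few lines. This is generic torchvision code for illustration only, not anything Apple ships, and the model choice is purely an assumption:

```python
# Classify a local photo with a pretrained ImageNet model.
import torch
from PIL import Image
from torchvision import models, transforms

weights = models.ResNet18_Weights.DEFAULT
model = models.resnet18(weights=weights)
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = preprocess(Image.open("photo.jpg").convert("RGB")).unsqueeze(0)
with torch.no_grad():
    logits = model(image)

# Map the most likely class index to a human-readable label, e.g. "golden retriever".
print(weights.meta["categories"][logits.argmax(dim=1).item()])
```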

Software development, particularly for an existing product, is a lot more involved than that--if you're doing it right.
Depends a lot on the product and the team. But I think you are reading a lot more into my "isn't difficult" statement. It's a feature that would be straightforward for Apple engineers, or any other professionals, to code. I didn't say it was something you would code over your lunch hour and send out the door.
Disagree. On-device scanning is not and never will be "fine," in my view. Not for any reason or purpose, by anybody. Ever.
I'm not sure if you really mean "not for any reason or purpose." I don't recall the uproar over being able to search for pictures by content like "dog" ... You know, all the image recognition stuff that is done by scanning the images on the phone in a much more intrusive manner than what Apple would do in this case.

If it is the same exact scan that would be done server-side during an iCloud upload, and you're okay with that, what exactly is the objection based on? The end result is exactly the same in either case. Some evil government can pressure Apple to scan for other stuff in the cloud too (if it wasn't all encrypted) - in that case you would have less control.
 
Jesus. Even when you're pointed to the proof, you still deny it.
If you refuse to read Apple's own report as I suggested and instead make false assumptions based on a generalized statement on a website, here:

Will CSAM detection in iCloud Photos falsely report innocent people to law enforcement?

No. The system is designed to be very accurate, and the likelihood that the system would incorrectly identify any given account is less than one in one trillion per year. In addition, any time an account is identified by the system, Apple conducts human review before making a report to NCMEC. As a result, system errors or attacks will not result in innocent people being reported to NCMEC.



Again, in case you missed it...

Apple conducts human review before making a report to NCMEC



FYI...the NCMEC is a non-profit organization and is NOT law enforcement and NOT run by the government. Apple does NOT report accounts to law enforcement. The NCMEC MAY and probably WILL report the activity to law enforcement as that is what they do, but Apple does not.

SOURCE
 
That is about the most ludicrous argument I've seen in defense of Apple's privacy invasion plans so far. Well done!
Yet, you cannot explain why. Instead you make a dumb statement.

Apple already does and has done on-device scanning of images for dogs, cats, flowers, plants, places of interest, faces against a hashed database... why weren't you concerned someone could hack that process before? What is so different now?
 
Yet, you cannot explain why. Instead you make a dumb statement.
Oh, the irony... You know what they say about assumptions, right?

Apple already does and has done on-device scanning of images for dogs, cats, flowers, plants, places of interest, faces against a hashed database... why weren't you concerned someone could hack that process before?
Oh... I dunno... I guess I'd have to ask what would be the point in hacking such a thing? To mis-classify peoples' photos for them? "Oh no! I wanted to see my photos of plants and instead I got trains! I must've been hacked!" To mine their data for how many photos they take of cats vs. dogs vs. plants vs. buildings vs. people vs. whatever?

But hack something that's meant to be spyware, perhaps cause it to mis-report or fabricate results...? Classify things that were not part of the original intent?

What is so different now?
The difference is one system is employed for the end-user's benefit. To categorize photos for the owner of the device. The other is used to spy upon the user on "their own" device. In the one system there's been no suggestion the results would be exported to third parties--or even to Apple, itself, for review. The other clearly does that.

TBH: I'd hardly thought it necessary to explain why equating one system with the other was specious. Perhaps I over-estimated my audience?
 
Oh, the irony... You know what they say about assumptions, right?


Oh... I dunno... I guess I'd have to ask what would be the point in hacking such a thing? To mis-classify peoples' photos for them? "Oh no! I wanted to see my photos of plants and instead I got trains! I must've been hacked!" To mine their data for how many photos they take of cats vs. dogs vs. plants vs. buildings vs. people vs. whatever?

But hack something that's meant to be spyware, perhaps cause it to mis-report or fabricate results...?


The difference is one system is employed for the end-user's benefit. To categorize photos for the owner of the device. The other is used to spy upon the user on "their own" device. In the one system there's been no suggestion the results would be exported to third parties--or even to Apple, itself, for review. The other clearly does that.

TBH: I'd hardly thought it necessary to explain why equating one system with the other was specious. I guess I over-estimated my audience

The intent doesn’t matter.

The tech is the same, the opportunity is the same… and if you are really worried that the hashed images, or the data associated with those hashes that are hard-coded into the system by Apple, could be compromised in any way, you’ve been watching too many spy movies.

Spyware means someone can actually “spy” due to the changes being made on your device.

If Apple, or anyone else for that matter, can’t see or know what is even happening until YOU upload data to the cloud, how is that spying?
 
If Apple is not acting as an agent for law enforcement, then it seems to me that they are breaking even more privacy laws; hacking perhaps. Invasion of privacy.

Just as a person (not law enforcement) wouldn't be allowed to walk in and search your home (they couldn't even get a warrant without being law enforcement), Apple shouldn't be given permission to just peruse your files at their leisure.

I'm not a lawyer (nor do I play one on TV), but I really think that the lawyers at the EFF and even the ACLU need to tap in on this.
I have no doubt one or more lawsuits will be forthcoming once iOS 15 is released and this feature goes live. Forum member @Playfoot noted in one of his posts on the subject that a court ruled that an entity acting in the same fashion as Apple was held to be an agent of the state. I don't recall the case, but maybe he will repost or link to the information he found.

edited to add: I found the case.

 
As for Apple scanning pics that are about to be uploaded to iCloud, I have NO problem with it. I presume it is illegal for Apple to be storing child porn images on iCloud servers; they have a responsibility to do this if they can.
I say again....

Many people have stated that Apple is required to do this, as a service provider.

Someone also linked to the actual law; 18USC2258A.

Here's an interesting part of this. 2258A, section (f)

(f) Protection of Privacy.-Nothing in this section shall be construed to require a provider to-

(1) monitor any user, subscriber, or customer of that provider;

(2) monitor the content of any communication of any person described in paragraph (1); or

(3) affirmatively search, screen, or scan for facts or circumstances described in sections (a) and (b).

Now... read that again carefully. NOTHING in this section shall be construed to *require* a provider to...
... MONITOR ANY USER, SUBSCRIBER, OR CUSTOMER
... MONITOR THE CONTENT OF ANY COMMUNICATION...
... AFFIRMATIVELY SEARCH, SCREEN OR SCAN FOR FACTS OR CIRCUMSTANCES.


That being said, this is a CHOICE by Apple... and NOT A REQUIREMENT. In fact, the law specifically says that they are NOT REQUIRED to scan, monitor or search for CSAM. Just to report it if it is discovered.
 