So many people are claiming Apple must do this scan because of US law. However, I strongly recommend that people listen to the video kindly supplied by SCVRX. I read the code, but listening might be easier for many.

In summary, 18 USC 2258A clearly states a company must act on "actual knowledge". Moreover, a company is "not required to monitor", "not required to affirmatively search, screen or scan files and data".

So, the question is why?
 
Processing capabilities have been going back and forth for 50 years, from the mainframe to clients, then back to servers and clouds, and now some capabilities back to local devices. Let’s not make too much drama out of technicalities and implementations. Yes, some processing power from the local device is now part of the crime-surveillance apparatus. Big deal. That’s the least of the problems. Science is science, tech is tech; it’s about how it’s used. Let’s wait for some hint of actual nefarious use before we get angry for real.
Precisely, but honestly, by the time there is a problem it will be too late. It is what we do now that matters. Actually, I worry we've already gone past the point of no return.
 
Precisely, but honestly, by the time there is a problem it will be too late. It is what we do now that matters. Actually, I worry we've already gone past the point of no return.
We are already past this point. I assume that most people don't know at all. They will hear the soundbite about "saving the children" and all will be "fine and dandy".

This week I successfully switched all production Macs to Arch Linux. My engineers are understanding and are willing to put in the effort. I don't understand at all how businesses will accept the idea of active scanning in macOS, but hey, this is the new "digital" world.

A world in which I don't want to belong at all. So I will stick to my old-fashioned definition of privacy, with its clear meaning:
A state in which one is not observed or disturbed by other people.
 
So many people are claiming Apple must do this scan because of US law. However, I strongly recommend that people listen to the video kindly supplied by SCVRX. I read the code, but listening might be easier for many.

In summary, 18 USC 2258A clearly states a company must act on "actual knowledge". Moreover, a company is "not required to monitor", "not required to affirmatively search, screen or scan files and data".

So, the question is why?
They have gone mad.
 
No they can't. If the backup is encrypted with a password set by the user, Apple has no access. Prove me wrong.
If you are using the standard iCloud setup, Apple can decrypt almost all of the stuff if it wants to. If you are encrypting the data yourself using a third-party system and then uploading it to iCloud, in theory they cannot read the data.
 
If you are using the standard iCloud setup, Apple can decrypt almost all of the stuff if it wants to. If you are encrypting the data yourself using a third-party system and then uploading it to iCloud, in theory they cannot read the data.
Standard iCloud? Wtf is that? No. When you have 2FA turned on, nobody but you can access your data, and only then on devices you own. It is encrypted end to end and on the server. That's the entire point of why this on-phone scanning is a big deal.

You don't know what you're talking about. Don't believe me. Believe Apple if you want.

"This means that only you can access your information, and only on devices where you’re signed into iCloud. No one else, not even Apple, can access end-to-end encrypted information."​


 
So does this apply only to iCloud pictures, or to pictures stored in the regular memory of your phone? Or both? Either way, that's a big invasion of privacy, and I wouldn't trust whatever they'd say about what else could be tracked in the future.
 

Lots of interesting news in here, including that the hash database was already embedded in iOS 14.3.
That is certainly interesting stuff, especially the part about the hash collisions (it's easier to get a match than I would have thought). And I really hate that it's been there since 14.3!

People were right when they said it could have been done before.
 
Standard iCloud? Wtf is that? No. When you have 2FA turned on, nobody but you can access your data, and only then on devices you own. It is encrypted end to end and on the server. That's the entire point of why this on-phone scanning is a big deal.

You don't know what you're talking about. Don't believe me. Believe Apple if you want.

"This means that only you can access your information, and only on devices where you’re signed into iCloud. No one else, not even Apple, can access end-to-end encrypted information."​


Err, you can link to Apple as much as you like; the fact is that Apple gave access to the FBI in the past.

Try again, fanboy.
 
So does this apply only to iCloud pictures, or to pictures stored in the regular memory of your phone? Or both? Either way, that's a big invasion of privacy, and I wouldn't trust whatever they'd say about what else could be tracked in the future.

It applies to any photo which is uploaded to iCloud. So if you have iCloud Photos turned on, just as the photo is about to be uploaded, the photo will be scanned, and the result of the scan and the photo will be uploaded to the cloud. If you turn off iCloud Photos, the photo will not be scanned.
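For the technically minded, the gating being described boils down to something like the sketch below. It's a minimal illustration in Python; every function name is hypothetical, invented here, and not Apple's actual code or API:

```python
# Toy sketch of the flow described above. All names are hypothetical,
# not Apple's real interfaces.

def scan_against_hash_db(photo):
    # Placeholder for the on-device match against the hash database.
    return {"matched": False}

def icloud_put(photo, voucher):
    # Placeholder for the actual upload.
    pass

def upload_to_icloud(photo, icloud_photos_enabled):
    if not icloud_photos_enabled:
        return  # no upload and, per the description above, no scan either
    voucher = scan_against_hash_db(photo)  # scanned on device, just before upload
    icloud_put(photo, voucher)             # photo and scan result travel together
```

The point of the structure is that the scan is tied to the upload path: nothing in the description above involves scanning photos that stay only on the device.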
 
Err, you can link to Apple as much as you like; the fact is that Apple gave access to the FBI in the past.

Try again, fanboy.
Unless I am missing something, this is how it is supposed to work: a reasonably public process, warrants, investigation, etc.

Somewhat different than what Apple will do beginning with iOS 15. I can't help but wonder why Apple might not at the very least provide a message, "scanning your files", while the scan is in progress, or prior to starting the scan. Heck, Apple tells me it is scanning files when I clear the trash bin....
 
Err, you can link to Apple as much as you like; the fact is that Apple gave access to the FBI in the past.

Try again, fanboy.
Apple can of course give access to the FBI in cases where encryption is not in use or if the FBI has somehow found the passcode. If it's encrypted and the keys are lost, nobody is getting the data. Not even in your dreams.

Maybe that's the fictional "standard" iCloud you were imagining. So yep, you're still wrong, and you can jam that fanboi BS right up your backside. You're a dilettante Apple antagonist compared to me.
 
Apple can of course give access to the FBI in cases where encryption is not in use or if the FBI has somehow found the passcode. If it's encrypted and the keys are lost, nobody is getting the data. Not even in your dreams.

Maybe that's the fictional "standard" iCloud you were imagining. So yep, you're still wrong, and you can jam that fanboi BS right up your backside. You're a dilettante Apple antagonist compared to me.
Apple holds the keys; it is not possible for them to be "lost". Apple gave access to an iCloud account, including photos. These are encrypted on Apple's servers, but Apple has the key to decrypt them.

You are an aggressive fool. I looked at all your posts; you are rude and an a*sehole in every single one. Bugger off. Something must be small, eh?
 
And to think PRSI was closed because some didn't like the atmosphere....

Anyway the bottom line is that for some this is OK because they want to stop child abuse. I understand that, and I think everybody does. The question is whether this is the right way to do this. Many people believe it is not, so stop telling them they don't understand, they're paranoid, etc. There are legitimate concerns with what Apple proposes to do. The problem is that Apple cannot be perfectly transparent about the process (e.g., what the hash extracts from the photos) because then pedophiles could defeat Apple's spyware. So they are caught in a Catch-22 of their own making. And so are we if we stay with Apple's products and they carry out this lunacy.
 
If they are just scanning the hash, couldn't the abuser just modify the image slightly (like adding a pixel somewhere) and be in the clear?
It's a neural hash, not a checksum. Just as your eye can tell that a high-res photo and a low-res photo of the same thing ARE the same thing, or that it's the same thing with the sky photoshopped to green instead of blue, that's how the hash works.
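To make the distinction concrete, here's a toy comparison in Python. NeuralHash itself is a learned image embedding, not the crude average-hash below; this is only a sketch of why a perceptual hash survives a one-pixel edit while a cryptographic checksum does not:

```python
import hashlib

def average_hash(pixels):
    # Toy perceptual hash: one bit per pixel, set if the pixel is brighter
    # than the image's mean. A tiny edit barely moves the mean, so the bit
    # pattern (and hence the hash) usually survives it.
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def checksum(pixels):
    # Cryptographic hash: flipping even a single pixel scrambles the digest.
    return hashlib.sha256(bytes(p for row in pixels for p in row)).hexdigest()

# A 4x4 grayscale "photo" and a copy with one pixel nudged by one level.
img = [[10, 10, 200, 200],
       [10, 10, 200, 200],
       [10, 10, 200, 200],
       [10, 10, 200, 200]]
img2 = [row[:] for row in img]
img2[0][0] += 1  # the "modify a pixel" edit from the question above

print(average_hash(img) == average_hash(img2))  # True: perceptual hash still matches
print(checksum(img) == checksum(img2))          # False: checksum does not
```

The flip side of a hash this forgiving is exactly the collision problem discussed elsewhere in the thread: two genuinely different images can be nudged into matching.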
 

Researchers claim they have probed a particular part of Apple's new system to detect and flag child sexual abuse material, or CSAM, and were able to trick it into saying two images that were clearly different shared the same cryptographic fingerprint. But Apple says this part of its system is not supposed to be secret, that the overall system is designed to account for this to happen in general, and that the analyzed code is not the final implementation that will be used with the CSAM system itself and is instead a generic version.
It would explain why Apple has the manual review threshold at 30, if this system is easily triggered with false positives.
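A back-of-the-envelope calculation shows why a threshold changes the picture. Assume, purely for illustration (these numbers are made up, not Apple's), a per-photo false-match probability of one in a million and a library of 10,000 photos; the chance of at least k false matches is a binomial tail that collapses as k grows:

```python
from math import comb

def prob_at_least_k(n, p, k):
    # P(X >= k) for X ~ Binomial(n, p): the chance that at least k of n
    # independent photos each false-match with probability p. Successive
    # terms are built multiplicatively to avoid enormous factorials.
    term = comb(n, k) * (p ** k) * ((1 - p) ** (n - k))
    total, i = 0.0, k
    while i <= n and term > 1e-320:
        total += term
        term *= (n - i) / (i + 1) * p / (1 - p)
        i += 1
    return total

print(prob_at_least_k(10_000, 1e-6, 1))   # ~0.01: one false match is quite plausible
print(prob_at_least_k(10_000, 1e-6, 30))  # ~1e-93: thirty is astronomically unlikely
```

Apple's published one-in-a-trillion-per-year account error claim presumably rests on a calculation of this shape, with its own measured per-image rate plugged in.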
 
  • Like
Reactions: Playfoot
Apple cannot scan at the server end if they decide to start E2EE for iCloud Photos. I think this development will allow them to start E2EE, which will further enhance user privacy.

My take is that Apple is doing this to comply with local laws. In your hypothetical situation where a country made it a legal responsibility to perform mass surveillance, Apple would not be the only one affected. Everyone operating in that country would be affected, because the law dictated it.

Apple doesn't get any benefit at all from doing it unilaterally. I think this point is important when discussing such issues.

If anyone has already made up their mind that Apple is acting for nefarious reasons without asking why, it's difficult to have any meaningful debate.

To me, this development is just Apple complying with US laws, yet still intent on advancing user privacy with proper E2EE for iCloud Photos. Apple is not being altruistic. Making their devices and services privacy-focused is a differentiating factor that helps them sell more. If making more profit were not their objective, then Apple would have no reason to exist.
Nonono.... Apple is not complying with US laws. They are taking the law into their own hands. In fact, US law specifically has privacy provisions stating that a provider is NOT required to actively scan for, discover, etc., this material.

Meanwhile, a thought that I had regarding E2EE.... if Apple says "We only review your images if you get 30 hits on the CSAM database"... wait... how are they reviewing your images? This indicates to me that Apple has a master decryption key (which I would expect them to have) and that scanning COULD be done on the iCloud servers.

Either it's encrypted and nobody (including Apple) can get to the encrypted files, or Apple can review them without your permission (which gets triggered with 30 "hits"). It can't be both ways.

We are not being told all of the facts here.
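For what it's worth, Apple's technical summary answers this exact puzzle with threshold secret sharing: roughly, each matching photo's "safety voucher" contributes one share of a per-account key, and only once at least 30 matching vouchers exist can the server reconstruct that key and decrypt the flagged material (and only that material). Below is a toy Shamir-style sketch of the idea; the field, the parameters, and every name are illustrative, not Apple's actual construction:

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime; toy field for the arithmetic

def make_shares(secret, threshold, count):
    # Shamir's scheme: hide the secret as the constant term of a random
    # polynomial of degree threshold-1; each share is one evaluation.
    # Fewer than `threshold` shares reveal nothing about the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def recover(shares):
    # Lagrange interpolation at x = 0; lands on the secret only when at
    # least `threshold` shares are combined.
    total = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

account_key = 123456789  # stands in for the per-account decryption key
vouchers = make_shares(account_key, threshold=30, count=100)
print(recover(vouchers[:29]) == account_key)  # False: 29 matches decrypt nothing
print(recover(vouchers[:30]) == account_key)  # True: the 30th crosses the threshold
```

Under a design like that, both halves of the "it can't be both ways" objection can hold at once: below the threshold Apple holds ciphertext it cannot open, and at the threshold the vouchers themselves hand over the key to the matched material.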
 
Nonono.... Apple is not complying with US laws. They are taking the law into their own hands. In fact, US law specifically has privacy provisions stating that a provider is NOT required to actively scan for, discover, etc., this material.

Meanwhile, a thought that I had regarding E2EE.... if Apple says "We only review your images if you get 30 hits on the CSAM database"... wait... how are they reviewing your images? This indicates to me that Apple has a master decryption key (which I would expect them to have) and that scanning COULD be done on the iCloud servers.

Either it's encrypted and nobody (including Apple) can get to the encrypted files, or Apple can review them without your permission (which gets triggered with 30 "hits"). It can't be both ways.

We are not being told all of the facts here.
Part of the problem is that Apple cannot give us all the details, or otherwise pedophiles will figure out how to defeat the system. Indeed, this is going to spark an arms race of sorts between the pedophiles and the authorities. The casualty will be our privacy.
 
Everyone does do this already, but they scan ALL OF YOUR PHOTOS. Apple is matching an incomplete/partial set of 1's and 0's (the beginning strings of all the kiddie porn in the database) on your phone when the device starts to upload to iCloud.... when it hands off to iCloud, that cloud server matches the other half, and if both halves match in that literal second, then, and only then, is your identifiable info flagged. And if that happens.... you're a freaking pedophile. If you're not a freaking pedophile, it will never happen. They also don't report you unless you have a rather large number of kiddie porn matches (30.... which is kind of a lot of kiddie porn).

Facebook, Google, Microsoft, and every other photo site look at all your pics daily and report you for kiddie porn. Apple's solution lets them never look at your photos at all unless you're a pedo.

I don't get why people, especially tech people, are so upset over this, unless they didn't read/listen/watch the explanations of how it works.

And the texting feature.... how is it bad to make sure your 11-year-old isn't sexting the universe or that some pervert isn't sexting his junk to your little kid?
I am a technical person. System administrator for 30+ years, programmer for 35+ years. I've developed systems that are still in use today.

I get how it works. I also get how easily it could be abused.

As I've stated before, would you be comfortable with someone installing a camera in your home, with the promise that it will ONLY be used to detect child abuse, and nobody would ever look at your spouse in their underwear? Let's take it a step further... would you be OK with someone installing a camera in your house that would only be reviewed if you did something illegal? Always on... always recording... but hey, you were PROMISED it would never be used to spy on you or your spouse, or even your kids for that matter. Hey, you don't do anything illegal, right? Like smoke pot, eat chicken with a knife and fork [Georgia], take a crawfish off of someone else's plate [10 years in jail in Louisiana], have a pet rat [Montana], feed your cat or dog less than once a day [Pennsylvania], share your Netflix password [Tennessee]... I mean, you never break ANY law, right? So why should you care if someone puts a camera in your home?

... Other than the invasion of privacy on a promise that it would only be used for one thing, right?

As far as the texting feature goes, when did it become Apple's job to be a digital parent and tattle on your kids? A real parent should be reviewing their kid's phone... do you have the password for your kid's phone? Do you review what apps they use and what messages they're sending? Who should monitor the children? The answer should be apPARENT, after all.
 
As far as the texting feature goes, when did it become Apple's job to be a digital parent and tattle on your kids? A real parent should be reviewing their kid's phone... do you have the password for your kid's phone? Do you review what apps they use and what messages they're sending? Who should monitor the children? The answer should be apPARENT, after all.

I agree with you 100%... but if Apple doesn't trust us even to remember to stand up or even to breathe (Apple Watch), it is only logical that we can't be trusted to be good parents ;)
 