So it's trespassing then? ;)
It would be hard to claim trespassing when the user has to agree to the software terms when installing the update. I get your point (and the joke), though. :D

Unless it can be shown that Apple's actions with the soon-to-be-released feature violate state and federal law, a user will have to mitigate the impact (e.g., by not turning on iCloud services), not update to iOS 15, or change OS.

I do believe we will see lawsuits very quickly.
 
  • Like
Reactions: jntdroid
He doesn’t understand that the wiggle room allowed by the trained AI in matching hashes lets it catch slightly edited versions of the offending CSAM photo, but not an unrelated photo.
He just doesn’t get it.
Please just stop. Stop treating me like an idiot. I am seriously frustrated by your damn attitude with me on 5 different threads now.

It's NOT 1:1... proof is provided by Apple.
Not being 1:1 means there is some leeway.

Images are a series of individual pixels with RGB values, plus alpha (A) values for some supported file types. There can be a combination of RGB(A) values that makes an image match without it being the actual offending image.
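To make the "leeway" concrete, here's a toy Python sketch of an average-hash-style perceptual hash. This is NOT Apple's NeuralHash, just an illustration of the general idea: a lightly edited copy of an image hashes very close to the original, while an unrelated image lands far away.

```python
import random

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel of a tiny 8x8 grayscale image,
    set to 1 if the pixel is brighter than the image's average."""
    avg = sum(pixels) / len(pixels)
    return [1 if p > avg else 0 for p in pixels]

def hamming(a, b):
    """Number of differing bits; a 'match' means close enough, not identical."""
    return sum(x != y for x, y in zip(a, b))

random.seed(0)
original  = [random.randint(0, 255) for _ in range(64)]
edited    = [min(255, p + 3) for p in original]           # slight brightness tweak
unrelated = [random.randint(0, 255) for _ in range(64)]   # a different image entirely

print(hamming(average_hash(original), average_hash(edited)))     # small: still matches
print(hamming(average_hash(original), average_hash(unrelated)))  # large: no match
```

The point is only that the matching has tolerance built in by design: minor edits don't defeat it, and an unrelated image won't accidentally produce a match barring an unlucky or deliberately engineered collision. NeuralHash gets the same property from a trained network rather than pixel averages.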

 
  • Like
Reactions: Grey Area
And who is going to check the ones who provide the hashes that the photos are being compared against? Who checks them? I'm not worried about Apple, I'm worried about whatever government gets their hands on what checks are applied. Suddenly it allows them to also search for anything or anybody.

Man, privacy was what was keeping me away from Android. Now Apple does this...?

And the way Apple is doing this, they themselves will have no visibility into the databases they are scanning against.
(trying to find link to where I saw that explained)
 
  • Like
Reactions: Philip_S
And who is going to check the ones who provide the hashes that the photos are being compared against? Who checks them? I'm not worried about Apple, I'm worried about whatever government gets their hands on what checks are applied. Suddenly it allows them to also search for anything or anybody.

Man, privacy was what was keeping me away from Android. Now Apple does this...?
Assume a worst-case scenario: <insert bad actor here> injects some hash of <actual picture of something> that gets sent to Apple. Apple updates to iOS 15.x, which includes this hash in the database. Now, if you happen to upload a near-exact match of <actual picture of something>, your account is flagged. Then Apple reviews this photo --- if it's not child abuse, then they know that the hash of <actual picture of something> is pointing to some photo that doesn't belong in their database. They can then update to iOS 15.x+1, which removes the hash of <actual picture of something> so that future owners of <actual picture of something> are not flagged for child abuse review.

In other words, if you trusted Apple up until now to protect your privacy, then it's up to you to trust them in the future. If Apple is a bad actor, then it really doesn't matter if they do this or not --- they could implement a backdoor without publicly announcing it whenever they see fit. However, if Apple is still pro-privacy, then you can be assured that this in no way allows an outside bad actor to manipulate the system.
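For what it's worth, here's a rough sketch of the correction loop being described. All the names, structures, and values are made up for illustration; Apple hasn't published their pipeline in this form.

```python
# Hypothetical sketch of the review-and-correct loop described above.
# Names and data structures are illustrative only, not Apple's implementation.

hash_database = {"hash_of_known_csam", "hash_of_injected_photo"}  # ships with iOS 15.x

def handle_flagged_upload(matched_hash, photo, human_reviewer):
    """Runs only after an account is flagged on Apple's side."""
    if human_reviewer(photo) == "csam":
        return "report to NCMEC"
    # A non-CSAM hit means the hash itself doesn't belong in the database:
    hash_database.discard(matched_hash)   # dropped from the next database update
    return "no action against the user"
```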
 
I don't understand why Apple cannot do the CSAM scanning server-side only and let the end user decide if they want to continue to use iCloud or not. At this point, all Apple has done is create a backdoor into my device that I have zero control over. At least with Android, they are not doing CSAM scanning on your physical device, so you can easily opt out by not uploading files to Google's cloud servers.
 
I don't understand why Apple cannot do the CSAM scanning server-side only and let the end user decide if they want to continue to use iCloud or not. At this point, all Apple has done is create a backdoor into my device that I have zero control over. At least with Android, they are not doing CSAM scanning on your physical device, so you can easily opt out by not uploading files to Google's cloud servers.
I think this is because it protects them from hosting CSAM content in the first place. There are also studies showing that server-side scanning has a plethora of privacy issues that are mitigated with this technique.

Also, not even close to a backdoor, if you look at what they're actually doing.
 
  • Like
Reactions: giggles
And who is going to check the ones who provide the hashes that the photos are being compared against? Who checks them? I'm not worried about Apple, I'm worried about whatever government gets their hands on what checks are applied. Suddenly it allows them to also search for anything or anybody.

Man, privacy was what was keeping me away from Android. Now Apple does this...?

Suppose a non-CSAM photo is inserted by the US govt in the NCMEC repository.

First of all, the US govt would have to wait for the next iOS update for the database to be updated, so they can’t do time-sensitive targeted searches of a particular individual.

Secondly, they would need to pray that the same iCloud account has enough matching pics to surpass the threshold of multiple matches (they would need to inject multiple of these “rogue” hashes). This is a supremely inefficient way to search for a particular photo, almost hopeless.

Third, once the non-CSAM rogue photo targeted by the govt escalates to human review at Apple, Apple’s reviewer would see that the photo doesn’t depict kiddie p0rn but instead a MAGA hat or an antifa protester or whatever. He would then say out loud: “What in the actual f- is going on here?” Apple would investigate and stop using the NCMEC repository, since it’s been infiltrated by the US govt.

Then what?

Again, a supremely inefficient way to abuse this system.
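To put the threshold point in code form, the logic amounts to something like the sketch below. The threshold value here is hypothetical; the point, per the post above, is just that a single match is not enough to flag an account.

```python
# Illustrative only: the real threshold and bookkeeping are Apple's, not public here.
MATCH_THRESHOLD = 30   # hypothetical value

def needs_human_review(matched_hashes_for_account):
    """One planted 'rogue' hash match does nothing; escalation requires many."""
    return len(set(matched_hashes_for_account)) >= MATCH_THRESHOLD

print(needs_human_review({"rogue_hash_1"}))   # False: a single injected match is useless
```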
 
Apple needs to make one of its fancy dancy videos that solidly explains how this is all going to work. Their current roll-out of this is a PR disaster. It sounds creepy & even after reading about how it works, I'm still not enthralled with it.
Ok, first off, let me say I do not support Apple's change, but I do have some understanding of how this works because I've been involved in child porn cases as a defense attorney. The majority of child porn prosecutions are for known material. The material all has an assigned "hash value," which is a unique string of letters and numbers. Regardless of the file name, the hash value remains the same for known images. When the cops look for child porn, they are really only looking for the known hash values. Once they find a file that has the unique known hash value, they verify it's contraband and track the image back to an IP address.

Apple seems to be scanning for the hash values only, not looking at images. The cops need a warrant to go to the ISP and get the physical address behind an IP address; Apple is skipping that part and giving law enforcement the user info for people who have those hash values.
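For contrast, the "known hash value" matching described above is ordinary cryptographic hashing of file contents, roughly like this (SHA-256 is used here just as a stand-in for whatever hash set an agency actually maintains, and the database entry is a placeholder):

```python
import hashlib

# Placeholder standing in for an agency's list of known-file hashes.
known_hashes = {"<hash of a known contraband file>"}

def file_hash(path):
    """Hash of the file's bytes; renaming the file doesn't change this."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def is_known_file(path):
    return file_hash(path) in known_hashes
```

Note this is exact-match hashing: change a single byte and the hash changes completely, which is one reason Apple's system uses a perceptual hash (NeuralHash) instead.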
 
I think this is because it protects them from hosting CSAM content in the first place. There are also studies showing that server-side scanning has a plethora of privacy issues that are mitigated with this technique.

Also, not even close to a backdoor, if you look at what they're actually doing.
If that were the case, all the cloud storage providers would be in legal trouble. I would say device-side scanning is a bigger privacy issue compared to server-side scanning. It is a backdoor into my phone since it enables Apple to monitor the device and at any time could expand the parameters without the end user knowing.
 
Suppose a non-CSAM photo is inserted by the US govt in the NCMEC repository.

First of all, the US govt would have to wait for the next iOS update for the database to be updated, so they can’t do time-sensitive targeted searches of a particular individual.

Secondly, they would need to pray that the same iCloud account has enough matching pics to surpass the threshold of multiple matches (they would need to inject multiple of these “rogue” hashes). This is a supremely inefficient way to search for a particular photo, almost hopeless.

Third, once the non-CSAM rogue photo targeted by the govt escalates to human review at Apple, Apple’s reviewer would see that the photo doesn’t depict kiddie p0rn but instead a MAGA hat or an antifa protester or whatever. He would then say out loud: “What in the actual f- is going on here?” Apple would investigate and stop using the NCMEC repository, since it’s been infiltrated by the US govt.

Then what?

Again, a supremely inefficient way to abuse this system.
Even more, it wouldn't be just some MAGA hat or <whatever>, it would need to be a near-exact match of the photo that was hashed.
 
  • Like
Reactions: giggles
If that were the case, all the cloud storage providers would be in legal trouble. I would say device-side scanning is a bigger privacy issue compared to server-side scanning. It is a backdoor into my phone since it enables Apple to monitor the device and at any time could expand the parameters without the end user knowing.
Bingo --- cloud storage providers have a legal (financial) incentive not to store this stuff. Just because you say device-side scanning is a bigger privacy issue doesn't mean it is; there is peer-reviewed research regarding these issues. Again, a backdoor this is not --- look up NeuralHash to see how the network works; they cannot just change/expand parameters for this to surveil everything you do.

What's more important: if you used to think Apple cared about your privacy, then this should in no way change that. This has no bearing on whether or not Apple is trustworthy, just that this does not enable a backdoor in any way whatsoever -- Apple still holds the keys.
 
Assume a worst-case scenario: <insert bad actor here> injects some hash of <actual picture of something> that gets sent to Apple. Apple updates to iOS 15.x, which includes this hash in the database. Now, if you happen to upload a near-exact match of <actual picture of something>, your account is flagged. Then Apple reviews this photo --- if it's not child abuse, then they know that the hash of <actual picture of something> is pointing to some photo that doesn't belong in their database. They can then update to iOS 15.x+1, which removes the hash of <actual picture of something> so that future owners of <actual picture of something> are not flagged for child abuse review.

In other words, if you trusted Apple up until now to protect your privacy, then it's up to you to trust them in the future. If Apple is a bad actor, then it really doesn't matter if they do this or not --- they could implement a backdoor without publicly announcing it whenever they see fit. However, if Apple is still pro-privacy, then you can be assured that this in no way allows an outside bad actor to manipulate the system.
Sorry, but that's a bit naive, I think. China won't allow Apple in California to review the images of state-designated terrorists in Taiwan. They'll very likely have a backdoor into the software Apple provides on the phone, letting them upload new hashes. What do you think intelligence services are like around the world? They will absolutely drool over this kind of tech. Every photo is going to be analyzed for bad actors and objects.

I'm a software engineer and I am very aware how certain loopholes can "find their way" into software. If it leaks, they'll apologize. But good luck figuring out what the encrypted data going over your 5G-connection really is. Gee, a few hundred bytes of data from the Apple server. Nothing to worry about! Unless there's something to worry about.

What about the US government? They are known to do crazy things. The Gulf of Tonkin incident is a famous one, "weapons of mass destruction in Iraq" a more recent one. The world's governments are full of power-hungry and untrustworthy bad actors.

Today, they're looking for missing children. Tomorrow, missing demented elderly people. Next week, terrorists. Next month, wanted criminals. Next year, well gosh, China demands that Apple sends positive hits directly to them; here's 100MB of hashes to check. And the next day a few thousand Uyghurs are taken away, never to be heard from again.

It's a slippery slope argument because it's a big scary slippery slope.

We don't know who is in charge, who is responsible, and nobody is making sure we can trust those people. And while that's the biggest issue, in many countries in the world where Apple sells their things, you can't even expect a government to be trusted to have an open and honest panel of reviewers to review the reviewers.

China, Russia, Kazakhstan, Iran, Venezuela, Algeria, Congo, Belarus, Saudi Arabia, the UAE, and so many more.

This is ridiculously scary.
 
I'm tempted to hack into the CSAM database and upload a bunch of pictures of myself shirtless with "Hi Apple" written on my chest. Then I'll upload them all to iCloud some time later, once Apple has (hopefully) updated iOS to include hashes of my photos. If only I could find a way to be a fly on the wall in the reviewer room when I'm flagged...
 
  • Like
Reactions: jntdroid
This is part of what gets me about all of this. It's an opt-out "feature" for the criminals. It's going to do very little to help slow them down.
Apple’s main and legally-mandated goal is keeping that crap off their servers, not being the world police and solving crime once and for all.

Emphasis on legally-mandated.
 
  • Like
Reactions: sog1927
Bingo --- cloud storage providers have a legal (financial) incentive not to store this stuff. Just because you say device-side scanning is a bigger privacy issue doesn't mean it is; there is peer-reviewed research regarding these issues. Again, a backdoor this is not --- look up NeuralHash to see how the network works; they cannot just change/expand parameters for this to surveil everything you do.

What's more important: if you used to think Apple cared about your privacy, then this should in no way change that. This has no bearing on whether or not Apple is trustworthy, just that this does not enable a backdoor in any way whatsoever -- Apple still holds the keys.
It is a backdoor since the software is scanning the device for photo hashes to see if any match a list of provided hashes that is constantly updated. That scanning can be expanded by Apple policy or court order with a simple software update that the user would not know about.
 
Sorry, but that's a bit naive, I think. China won't allow Apple in California to review the images of state-designated terrorists in Taiwan. They'll very likely have a backdoor into the software Apple provides on the phone, letting them upload new hashes. What do you think intelligence services are like around the world? They will absolutely drool over this kind of tech. Every photo is going to be analyzed for bad actors and objects.

I'm a software engineer and I am very aware how certain loopholes can "find their way" into software. If it leaks, they'll apologize. But good luck figuring out what the encrypted data going over your 5G-connection really is. Gee, a few hundred bytes of data from the Apple server. Nothing to worry about! Unless there's something to worry about.

What about the US government? They are known to do crazy things. The Gulf of Tonkin incident is a famous one, "weapons of mass destruction in Iraq" a more recent one. The world's governments are full of power-hungry and untrustworthy bad actors.

Today, they're looking for missing children. Tomorrow, missing demented elderly people. Next week, terrorists. Next month, wanted criminals. Next year, well gosh, China demands that Apple sends positive hits directly to them; here's 100MB of hashes to check. And the next day a few thousand Uyghurs are taken away, never to be heard from again.

It's a slippery slope argument because it's a big scary slippery slope.

We don't know who is in charge, who is responsible, and nobody is making sure we can trust those people. And while that's the biggest issue, in many countries in the world where Apple sells their things, you can't even expect a government to be trusted to have an open and honest panel of reviewers to review the reviewers.

China, Russia, Kazakhstan, Iran, Venezuela, Algeria, Congo, Belarus, Saudi Arabia, the UAE, and so many more.

This is ridiculously scary.

Very well said. I always think it’s funny when people immediately cite China instead of the US, particularly after the administration we just (narrowly) escaped had been caught repeatedly breaking the law and flagrantly lying about it.
 
Sorry, but that's a bit naive, I think. China won't allow Apple in California to review the images of state-designated terrorists in Taiwan. They'll very likely have a backdoor into the software Apple provides on the phone, letting them upload new hashes. What do you think intelligence services are like around the world? They will absolutely drool over this kind of tech. Every photo is going to be analyzed for bad actors and objects.

I'm a software engineer and I am very aware how certain loopholes can "find their way" into software. If it leaks, they'll apologize. But good luck figuring out what the encrypted data going over your 5G-connection really is. Gee, a few hundred bytes of data from the Apple server. Nothing to worry about! Unless there's something to worry about.

What about the US government? They are known to do crazy things. The Gulf of Tonkin incident is a famous one, "weapons of mass destruction in Iraq" a more recent one. The world's governments are full of power-hungry and untrustworthy bad actors.

Today, they're looking for missing children. Tomorrow, missing demented elderly people. Next week, terrorists. Next month, wanted criminals. Next year, well gosh, China demands that Apple sends positive hits directly to them; here's 100MB of hashes to check. And the next day a few thousand Uyghurs are taken away, never to be heard from again.

It's a slippery slope argument because it's a big scary slippery slope.

We don't know who is in charge, who is responsible, and nobody is making sure we can trust those people. And while that's the biggest issue, in many countries in the world where Apple sells their things, you can't even expect a government to be trusted to have an open and honest panel of reviewers to review the reviewers.

China, Russia, Kazakhstan, Iran, Venezuela, Algeria, Congo, Belarus, Saudi Arabia, the UAE, and so many more.

This is ridiculously scary.
Aside from the fact that this seems like a rant from far out in left field, you clearly don't understand how the neural hash network functions. As a software engineer, you should have little difficulty reading the paper on it and understanding why your whole premise is false.

The only real issue with this new feature is if Apple is a bad actor or not --- that's up to each of us to gamble on for ourselves, which many of us have already been doing.
 
They can keep it off of their servers without looking for it on my phone.

Apple’s not looking into your phone.

An inanimate process with no connection to Apple servers is. It will not communicate with Apple until the end of time, well after we are all dead, if you don't upload those pictures to Apple's servers.
 
It is a backdoor since the software is scanning the device for photo hashes to see if any match a list of provided hashes that is constantly updated. That scanning can be expanded by Apple policy or court order with a simple software update that the user would not know about.
"Simple software update" would be complete tear down and rebuild from the ground up. Which they could do theoretically do at any moment of any day. If you're worried that Apple is going to comply with government court orders of this regard, then OK, you should stop using iPhone (and every other smart phone). If you think that this implementation is primed for this, then you don't understand how it works.
 
  • Like
Reactions: giggles