Once the scanning feature is there, it just needs some small changes to target anyone the gov wants, not just criminals.

The same is true of the Photos app. It has been on devices since 2007, and with just some small changes it could send information about _interesting_ photos to the government much more effectively than the CSAM detection system.

And yet, it hasn't happened.

Why would a government use an ineffective system when Photos is much more effective?
 
OK, first off let me say I do not support Apple's change, but I do have some understanding of how this works because I've been involved in child porn cases as a defense attorney. The majority of child porn prosecutions are for known material. The material all has an assigned "hash value," which is a unique string of letters and numbers. Regardless of the file name, the hash value remains the same for known images. When the cops look for child porn, they're really only looking for the known hash values. Once they find a file that has the unique known hash value, they verify it's contraband and track the image back to an IP address.

Apple seems to be scanning for the hash values only, not looking at images. The cops need a warrant to go to the ISP and tie an IP address to a physical address. Apple is skipping that part and giving user info for people who have those hash values to law enforcement.
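As a sketch of the "hash value" idea described above: a cryptographic hash is computed over a file's bytes, so renaming or copying the file doesn't change it. This example uses SHA-256 purely for illustration; forensic databases use various hash types in practice, and Apple's system uses a perceptual hash (NeuralHash) rather than a cryptographic one, so treat this as an analogy, not Apple's actual method.

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo: copying a file under a new name leaves the hash unchanged,
# which is why renaming a known file doesn't hide it.
tmp = Path(tempfile.mkdtemp())
original = tmp / "IMG_0001.jpg"
original.write_bytes(b"\xff\xd8\xff\xe0 fake image bytes")
renamed = tmp / "totally_innocent.dat"
shutil.copy(original, renamed)

assert file_hash(original) == file_hash(renamed)  # same bytes, same hash
```

Note the flip side: changing even one byte of the file yields a completely different cryptographic hash, which is exactly why Apple opted for a perceptual hash that tolerates resizing and re-encoding.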
Oh, I completely understand how it works. It's just that we're living in some really weird, unsure times. A lot of people, myself included, are wary that there's going to come a point when the government starts using its power to force tech companies into other areas with this technology, like hunting down political opposition. Yeah, I know, Apple will say they won't do this, but President Joe Biden just the other day ignored a Supreme Court ruling that the eviction moratorium was unconstitutional. So we've been staring down into a black hole. The question is, can we avoid the event horizon? That's why this makes me feel uneasy.
 
Without a change to the project or better explanation of how this can’t be turned into a slippery slope, I’ll be moving as much of my content off iCloud as possible and my trust in Apple will be significantly damaged. Every part of this announcement and rollout has been a dumpster fire.
 
What do you think a derivative is? It's a copy.

No. Just, no. Factually wrong. A derivative is *NOT* a copy. It is *derived* from the source.

If they look at or scan any of your photos in any form, derivative or whatever, whether it's a correct match or not, it's an invasion of privacy.

If so, you never had any privacy to begin with considering Spotlight on Mac and iPhone scans your entire device.

Also, Apple does not tell you what the threshold is. A "high" threshold could be 0 photos, or 1, or 2, or 10. Regardless, any threshold is unacceptable.

The threshold is set so that there's a 1 in 1 trillion chance per year of incorrectly flagging an account.
So even if it is just 2 photos that cross the threshold, the accuracy is so good that it will virtually never happen to most customers in their lifetime.
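As a rough sketch of how a per-photo false-match rate and a match threshold combine into a per-account error probability: if each photo is treated as an independent chance of a false match, the account-level odds are a binomial tail. Every number below is an assumption for illustration; Apple has published the 1-in-1-trillion per-account figure but neither its per-photo rate nor the exact threshold.

```python
from math import exp, lgamma, log

def log_binom_pmf(n: int, k: int, p: float) -> float:
    """log of the binomial pmf P(X = k) for X ~ Binomial(n, p)."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1 - p))

def false_flag_odds(p: float, n: int, t: int) -> float:
    """P(at least t false matches among n photos), modeling each
    photo as an independent false match with probability p."""
    total = 0.0
    for k in range(t, n + 1):
        term = exp(log_binom_pmf(n, k, p))
        total += term
        # terms shrink geometrically past the threshold; stop when negligible
        if k > t and term < total * 1e-18:
            break
    return total

# Illustrative numbers only -- these are NOT Apple's published parameters:
p = 1e-6      # assumed per-photo false-match probability
n = 10_000    # photos in a typical library
t = 30        # hypothetical match threshold before human review

print(f"odds of a false account flag: {false_flag_odds(p, n, t):.3g}")
```

The point of the sketch is that even a modest threshold drives the account-level probability far below the per-photo rate, which is why a per-account figure like 1 in 1 trillion does not require testing 1 trillion individual photos.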

any scanning or review of our private photos is a total invasion of privacy.

Spotlight already scans for it. The Photos app already does on-device facial recognition. You had no privacy to begin with. This new feature changes nothing with regard to privacy.

I'm done arguing with you. You've already made up your mind to believe your false statements so what's the point in continuing this conversation with you?
 
https://www.hackerfactor.com/blog/index.php?/archives/929-One-Bad-Apple.html
In addition, review the late 2020 and 2021 peer reviewed papers on hash based adversarial attacks and examples. Simple Google Scholar search will yield everything you need.

The person seems to miss a lot of details.

A. The 1 in 1 trillion figure is for accounts, not for hash collisions. He goes on and on about the need to test this with 1 trillion photos, which is not how they would achieve such a high number.
B. He doesn't understand that there are several different systems here: one for CSAM detection and one for detecting nudity in Messages.
C. He thought Apple would scan all photos, not only the ones in iCloud Photo Library.
D. He thought that Apple couldn't decrypt iCloud Photo Library.
 
Those focusing on privacy missed the fact that you never had any privacy on iCloud Photos to begin with.

That statement is particularly amusing considering Apple themselves were the ones focusing on privacy, weren't they?

 
Last I checked, Messages are still end to end encrypted. And iCloud can be turned off so that "What happens on your iPhone, stays on your iPhone" tagline remains true.

Amusing indeed.
Even if you turn off iCloud, the spyware/hash scan system will still be included in spyOS 15. Apple just can't use the results of the scan without iCloud- yet.
 
That makes zero sense.

Only if you don't care about Apple deciding to use your hardware to let them play police.

I'm exaggerating obviously... but that's the gripe some of us have. We get the technology. We get the perceived benefit of them doing it this way. We don't think they're doing anything nefarious. But we simply don't want them to use our device for it - keep that work on their servers.
 
Even if you turn off iCloud, the spyware/hash scan system will still be included in spyOS 15. Apple just can't use the results of the scan without iCloud- yet.

It won't run unless you turn on iCloud

"Apple is also reinforcing that if a user does not use iCloud Photos, then no part of the CSAM detection process runs. This means that if a user wants to opt out of the CSAM detection process, they can disable iCloud Photos."
 
It won't run unless you turn on iCloud

"For now"

The tool being installed on the device is the problem.

All that's stopping it being used for other things (or for all photos, not even iCloud ones) is a simple policy change by Apple (perhaps simply being compelled by a local authority, agency or government).

Creating the tool and installing it on users devices - at all - is the issue.
 
It won't run unless you turn on iCloud

"Apple is also reinforcing that if a user does not use iCloud Photos, then no part of the CSAM detection process runs. This means that if a user wants to opt out of the CSAM detection process, they can disable iCloud Photos."

That was good to hear today. At least the iCloud Photo toggle sounds like a true "on/off" switch for this functionality.
 
Only if you don't care about Apple deciding to use your hardware to let them play police.

I'm exaggerating obviously... but that's the gripe some of us have. We get the technology. We get the perceived benefit of them doing it this way. We don't think they're doing anything nefarious. But we simply don't want them to use our device for it - keep that work on their servers.

I mean, scanning every single photo on their end would add costs, and those costs would almost certainly be passed on to the consumer. Here we are asking for more than 5 GB of free storage; asking for even more processing on their servers too is a bit much.

I don't know how much this would affect the battery life, but I suspect the neural engine in the chip would keep energy usage low.
 
"For now"

The tool being installed on the device is the problem.

All that's stopping it being used for other things (or for all photos, not even iCloud ones) is a simple policy change by Apple (perhaps simply being compelled by a local authority, agency or government).

Creating the tool and installing it on users devices - at all - is the issue.

I don't really understand the point of being paranoid about potential actions in the future. You have no idea what Apple is going to do. On the other hand, Daring Fireball considers this possibly the first step toward enabling end-to-end encryption on all iCloud data, which would be a huge privacy win:


"Which in turn makes me wonder if Apple sees this initiative as a necessary first step toward providing end-to-end encryption for iCloud Photo Library and iCloud device backups. Apple has long encrypted all iCloud data that can be encrypted, both in transit and on server, but device backups, photos, and iCloud Drive are among the things that are not end-to-end encrypted."

Sure he could be wrong. But you could be too. So there's no point in guessing what Apple might do with this.
 
I mean, scanning every single photo on their end would add costs, and those costs would almost certainly be passed on to the consumer. Here we are asking for more than 5 GB of free storage; asking for even more processing on their servers too is a bit much.

I don't know how much this would affect the battery life, but I suspect the neural engine in the chip would keep energy usage low.

I'm sure you're right - though other cloud companies already do this? So I don't know how much cost it would add.

I don't think it will affect battery life much at all on the device.

I know it might seem silly, but it's just the principle of them using my device for it and not giving me a choice in the matter, other than disabling a feature I highly value as my family has 200+ GB of photo/video history on iCloud Photos between the 5 of us.
 
Every tool can and will be used for nefarious purposes. No need for a step-by-step guide. No need to nitpick wording and argue otherwise.

Also, if a private company like Apple goes all in on mass surveillance, the government would more than likely offload its burden onto private companies and change the law to take full advantage of that.

It's just unfortunate that there is literally no way for Apple or other companies to walk back this level of surveillance, and all Android manufacturers will follow very quickly, ramping up the surveillance war while every single customer becomes a victim sooner or later.

And I fear no amount of PR damage will cause Apple to roll back this surveillance software. They have released multiple articles showing commitment, with no sign of backing down. And any sales drop will be very minor (if it happens at all) compared to last year, given that most parents would rejoice anyway.

The next thing to look into is how bad things can get once the balance is already lost. Apple will be fine either way, and most Apple users (compulsory or voluntary) will have no choice but to offer unconditional surrender.
 