Not sure if you're being obtuse or just want to argue. In the context of Apple's implementation, it is clear they mean the same image that has been cropped/resized/colour-shifted but still looks similar to the original image, which Apple supported with examples. Taking the discussion out of context doesn't help the discussion.

In any case, it has also been explained by other posters that in the case of a false positive, the last line of defence is manual verification (which I would think would be extremely rare for a false positive) before reporting to NCMEC. I assume you have read those postings?

Anyway, not sure why you are answering on behalf of the poster I'd posted the question to.
 
Not sure if you're being obtuse or just want to argue. In the context of Apple's implementation, it is clear they mean the same image that has been cropped/resized/colour-shifted but still looks similar to the original image, which Apple supported with examples. Taking the discussion out of context doesn't help the discussion.
That's not what it says; it says similar images, and there are a lot of ways that can happen. The rest is only there to make people like yourself more comfortable with the concept. It's true, a match would be very rare given the parameters we have been told about, but those parameters can be changed quite easily. It's based on perceptual hashing, and allowing more differences is all NeuralHash needs in order to match photos that only share similar features.
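
To make that concrete, here's a toy sketch (the hash values and threshold numbers are made up, and this is nothing like Apple's actual NeuralHash parameters) of how a perceptual-hash matcher boils down to a Hamming-distance threshold, and of how loosening that threshold admits merely similar images:

```python
# Toy sketch, NOT Apple's NeuralHash: perceptual-hash matching reduces to
# a Hamming-distance threshold over fixed-size hashes.

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hash values."""
    return bin(a ^ b).count("1")

def matches(h1: int, h2: int, threshold: int) -> bool:
    return hamming(h1, h2) <= threshold

original  = 0xF0E1D2C3B4A59687        # hash of a known image (made-up value)
altered   = original ^ 0b111          # same image, recompressed: 3 bits flip
lookalike = original ^ 0x00FF00FF00FF # merely similar image: 24 bits flip

print(matches(original, altered,   threshold=10))  # True:  survives alteration
print(matches(original, lookalike, threshold=10))  # False: tight threshold
print(matches(original, lookalike, threshold=30))  # True:  loosened threshold
```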

In any case, it has also been explained by other posters that in the case of a false positive, the last line of defence is manual verification (which I would think would be extremely rare for a false positive) before reporting to NCMEC. I assume you have read those postings?
The trick here is I don't trust Apple to do what they say now. The act of putting the scanner/hashing/matching on the phone made sure of that. And others thinking the algorithms/hash db won't change once it's there are only deluding themselves.
 
That's not what it says; it says similar images, and there are a lot of ways that can happen. The rest is only there to make people like yourself more comfortable with the concept. It's true, a match would be very rare given the parameters we have been told about, but those parameters can be changed quite easily. It's based on perceptual hashing, and allowing more differences is all NeuralHash needs in order to match photos that only share similar features.
You're right that it's based on perceptual hashing (likely gleaned from a cursory reading of Apple's documentation). What you're clearly confused about is what perceptual hashing actually is. The wording isn't "only there to make people... more comfortable with the concept"; it is explaining, in layman's terms, the general idea behind the concept.

Here's a well-cited paper covering the topic: https://ieeexplore.ieee.org/document/1709989 and here's a well-cited work that does a fairly good job of defining the (sometimes conflicting) goals of perceptual hashing functions: http://phash.org/docs/pubs/thesis_zauner.pdf

Here's the tl;dr: a perceptual hashing algorithm is one that provides an efficient means of finding matches of the exact same image, even in the presence of alterations to that image. They are not designed to generalize to similar images in the sense of sharing certain visual characteristics, or even to images containing similar features. There are classes of algorithms that do those things, but the whole point of a perceptual hash is that it can distinguish true matches from images that are merely very close to the originals.
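
For the curious, here is a minimal difference hash (dHash), one classic perceptual hashing scheme. It is not Apple's NeuralHash, and the file names at the end are hypothetical, but it shows why an altered copy of the same image hashes close to the original while a genuinely different image does not:

```python
# A minimal difference hash (dHash): 1 bit per adjacent-pixel brightness
# comparison on a tiny grayscale copy of the image. Alterations such as
# resizing or recompression flip only a few bits; a different image
# typically lands ~32 of 64 bits away.

from PIL import Image

def dhash(path: str, hash_size: int = 8) -> int:
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left  = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# Hypothetical files: a resized copy of photo.jpg should land within a few
# bits of the original, while an unrelated image lands far away.
# print(hamming(dhash("photo.jpg"), dhash("photo_resized.jpg")))
```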

The trick here is I don't trust Apple to do what they say now. The act of putting the scanner/hashing/matching on the phone made sure of that. And others thinking the algorithms/hash db won't change once it's there are only deluding themselves.
Delusion is a funny thing; often we see others as deluded when we ourselves have misapprehended the facts.




Edit: added a secondary resource
 
You're right that it's based on perceptual hashing (likely gleaned from a cursory reading of Apple's documentation). What you're clearly confused about is what perceptual hashing actually is. The wording isn't "only there to make people... more comfortable with the concept"; it is explaining, in layman's terms, the general idea behind the concept.

Here's a well-cited paper covering the topic: https://ieeexplore.ieee.org/document/1709989

Here's the tl;dr: a perceptual hashing algorithm is one that provides an efficient means of finding matches of the exact same image, even in the presence of alterations to that image. They are not designed to generalize to similar images in the sense of sharing certain visual characteristics, or even to images containing similar features. There are classes of algorithms that do those things, but the whole point of a perceptual hash is that it can distinguish true matches from images that are merely very close to the originals.


Delusion is a funny thing; often we see others as deluded when we ourselves have misapprehended the facts.
I understand exactly what it is. This part of the conversation is over; personal insults are out of bounds.
 
I understand exactly what it is.
You're not alone: a lot of people here seem to understand exactly what it is without any qualifications whatsoever. It would be nice if any of the self-proclaimed experts here would at least point to some evidence for their theories.

This part of the conversation is over,
You're welcome to disengage from a conversation you stirred up. All I did was provide some clear evidence that contradicted your claims 🤷‍♂️

personal insults are out of bounds.
Where precisely did I personally insult anyone?
 
However, still the same: do not install iOS 15 or Monterey.
It would be smart to wait for the sleuths to determine whether the code was removed once the gold master is released; however, Apple is probably aware of what would happen if they left it in there, so I don't expect it will be there… Trust is pretty much gone now, though.
 
It would be smart to wait for the sleuths to determine whether the code was removed once the gold master is released; however, Apple is probably aware of what would happen if they left it in there, so I don't expect it will be there… Trust is pretty much gone now, though.
I don’t understand your logic. Apple has not released the code yet, and security researchers haven’t been able to pick it apart. Apple has been open and honest about what their plans were, yet they lost your trust?

They lost your trust long before this I reckon.
 
I don’t understand your logic. Apple has not released the code yet, and security researchers haven’t been able to pick it apart. Apple has been open and honest about what their plans were, yet they lost your trust?

They lost your trust long before this I reckon.
Not me. They lost my trust, hard and square, right here.

This scanning is a whole other thing once it's brought onto my device. I do not care about the intentions. From now until forever, the products I purchase should ONLY be designed to work for me. NO ONE is for child abuse (except those abusers, who should be punished accordingly), but the presumption is that something I purchased works for me, for my benefit: not for Apple's marketing, not for child safety, not for advertising, not for social causes (unless that is its purpose, of course), for me!

That's fine, Apple. I am simply no longer a part of the revenue stream. I am not trashing my devices, but I'm not updating them either, and I won't be buying new ones. We will soon see how long and far that goes: no new iTunes purchases, no iCloud, no iMessage, no iOS 15, etc.

That being said, I will miss the old Apple. They were the best for their time; I felt very happy to be a customer. They may still be the best of the worst, but that's far from what they were.

I will diversify, I will explore alternatives. We will see.
 
I don’t understand your logic. Apple has not released the code yet, and security researchers haven’t been able to pick it apart. Apple has been open and honest about what their plans were, yet they lost your trust?

They lost your trust long before this I reckon.
Well, a previous version of the code, at least, was found in iOS 14.3 and can be seen here: https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX

I know, it is not the version Apple will use (at least that is what they claim), but this version already shows the fundamental problems of such an algorithm.
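
For anyone who wants to poke at it themselves, here is a hedged sketch of running an extracted model with onnxruntime, roughly along the lines of that repo's instructions. The file names, the 360×360 input size, and the [-1, 1] normalisation are assumptions from my reading of its README; check the repo before relying on them.

```python
# Sketch: run an ONNX export of the extracted NeuralHash model.
# File names and preprocessing details are assumptions, not verified facts.

import numpy as np
import onnxruntime as ort
from PIL import Image

session = ort.InferenceSession("NeuralHash.onnx")   # assumed file name
input_name = session.get_inputs()[0].name

img = Image.open("photo.jpg").convert("RGB").resize((360, 360))
arr = np.asarray(img).astype(np.float32) / 255.0 * 2.0 - 1.0  # scale to [-1, 1]
arr = arr.transpose(2, 0, 1)[np.newaxis, ...]       # NCHW batch of one

embedding = session.run(None, {input_name: arr})[0]  # float feature vector
# The real pipeline then projects this embedding with a seed matrix and takes
# the sign of each component to produce the final 96-bit hash; omitted here.
print(embedding.shape)
```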

I own a MacBook, iPhone, Apple Watch, Apple TV, and iPad Pro 2021, and I have purchased a lot on the various stores. After already having bought the new iPad Pro for over 2,000 dollars, I was about to purchase a new MacBook Pro, a new iPhone, and a new Apple Watch this year, but I will not. There is no need for me to spend some 6,000 dollars on those products if I can get other devices that don't take my privacy seriously either for less than half the price.

Apple has to regain that trust by publicly acknowledging that it was a bad idea, that the project has been completely stopped, and that all traces of the code have been removed from all systems.
 
Well, a previous version of the code, at least, was found in iOS 14.3 and can be seen here: https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX

I know, it is not the version Apple will use (at least that is what they claim), but this version already shows the fundamental problems of such an algorithm.

I own a MacBook, iPhone, Apple Watch, Apple TV, and iPad Pro 2021, and I have purchased a lot on the various stores. After already having bought the new iPad Pro for over 2,000 dollars, I was about to purchase a new MacBook Pro, a new iPhone, and a new Apple Watch this year, but I will not. There is no need for me to spend some 6,000 dollars on those products if I can get other devices that don't take my privacy seriously either for less than half the price.

Apple has to regain that trust by publicly acknowledging that it was a bad idea, that the project has been completely stopped, and that all traces of the code have been removed from all systems.
What's the problem with the algorithm, the fact that you can force a collision if you know the original file's NeuralHash? That's not really a problem. Each false positive that is found goes through another check hosted solely on Apple's servers, against a different hash that is not known to anyone else, so even if you did "trick the system" with a "forced collision", it wouldn't make it through the second server-side hashing process, and it STILL wouldn't go to human review. Nobody will see your photos unless you have REAL CSAM.

If you're still scared after all of that, then I don't know how to help you.
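
Roughly, the logic looks like this toy sketch (both hash functions here are cheap stand-ins, not the real NeuralHash or Apple's private server-side hash): a forged image that collides on the device-side hash still has to match an independent hash that exists only server-side before anything reaches human review.

```python
# Toy model of the two-stage check; the hash functions are stand-ins.

import hashlib

def device_hash(image: bytes) -> bytes:
    """Stand-in for the on-device perceptual hash an attacker can target."""
    return hashlib.sha256(image).digest()[:4]

def server_hash(image: bytes) -> bytes:
    """Stand-in for the independent hash known only to Apple's servers."""
    return hashlib.sha256(b"server-only-secret" + image).digest()[:4]

def reaches_human_review(candidate: bytes, known_csam: bytes) -> bool:
    if device_hash(candidate) != device_hash(known_csam):
        return False  # never even flagged on the device
    # A forged collision gets this far, but it must ALSO match the second,
    # server-only hash, which the forger had no way to target:
    return server_hash(candidate) == server_hash(known_csam)
```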
 
What's the problem with the algorithm, the fact that you can force a collision if you know the original file's NeuralHash? That's not really a problem. Each false positive that is found goes through another check hosted solely on Apple's servers, against a different hash that is not known to anyone else, so even if you did "trick the system" with a "forced collision", it wouldn't make it through the second server-side hashing process, and it STILL wouldn't go to human review. Nobody will see your photos unless you have REAL CSAM.

If you're still scared after all of that, then I don't know how to help you.
It's not even that much of a problem. The CSAM hashes on the device are encrypted and unreadable by iOS, so somebody trying to engineer a hash collision wouldn't have access to any hashes to try to match.
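
As a toy illustration of why that matters (the keyed-HMAC blinding below is a stand-in; Apple's actual design uses elliptic-curve blinding, which this does not reproduce): the device only ever holds blinded values, so there is nothing to aim a forced collision at.

```python
# Toy sketch: the database shipped to devices contains only *blinded* hashes.

import hmac, hashlib

SERVER_SECRET = b"held-only-by-apple"  # hypothetical; never ships to devices

def blind(neural_hash: bytes) -> bytes:
    return hmac.new(SERVER_SECRET, neural_hash, hashlib.sha256).digest()

# What the device actually stores: blinded entries only.
on_device_db = {blind(h) for h in (b"hash-of-image-1", b"hash-of-image-2")}

# Without SERVER_SECRET, no entry in on_device_db can be mapped back to a
# hash value, so an attacker has no target to engineer a collision for.
print(len(on_device_db))
```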
 
What's the problem with the algorithm, the fact that you can force a collision if you know the original file's NeuralHash? That's not really a problem. Each false positive that is found goes through another check hosted solely on Apple's servers, against a different hash that is not known to anyone else, so even if you did "trick the system" with a "forced collision", it wouldn't make it through the second server-side hashing process, and it STILL wouldn't go to human review. Nobody will see your photos unless you have REAL CSAM.

If you're still scared after all of that, then I don't know how to help you.
This is even more problematic, in my opinion. The problem I see is that images that are false positives leave my phone without my consent. I want to know where my files go, and I do not want my phone sending files somewhere in the background, whether for another hash check or for a human review. This has nothing to do with being scared.
 
The problem I see is that images that are false positives leave my phone without my consent.
If you use iCloud Photo, your image leaves your phone regardless of whether a CSAM hash is done or not. If you don't use iCloud Photo, your image stays on your phone, and no CSAM hash is done, even with your phone running iOS 15.
 
If you use iCloud Photo, your image leaves your phone regardless of whether a CSAM hash is done or not. If you don't use iCloud Photo, your image stays on your phone, and no CSAM hash is done, even with your phone running iOS 15.
If I use iCloud Photo, I consciously choose to send my photos to the iCloud Photo server, and only to the iCloud Photo server, for one specific purpose of which I am aware. I would even be OK with Apple scanning the files on their server.

But I am definitely not ok with images leaving my phone to some unknown server where they are used for something I do not know and did not agree to. I can’t understand how people can defend this and compare it with using iCloud Photo. It is a completely different story.
 
But I am definitely not ok with images leaving my phone to some unknown server where they are used for something I do not know and did not agree to. I can’t understand how people can defend this and compare it with using iCloud Photo. It is a completely different story.
Not sure if you understand how Apple is implementing the entire widget.

There will be no hash generated (which many on the forums have termed 'scanning', though as far as I understand it, it is not a scan of photos), and thus no security voucher generated, if there's no upload to iCloud Photo. Thus nothing leaves the phone.

The hash is only done for photos that are about to be uploaded to iCloud Photo. If iCloud Photo is not enabled, no photo will be hashed and no security vouchers will be sent (security vouchers are only generated for matched hashes; otherwise, as I understand it, no security vouchers are generated). If you trust what Apple is saying, that's all there is to it.

And if you are already sending photos to iCloud Photo, your photos are leaving your device and being stored on Apple's servers, regardless of whether any hashes or security vouchers are computed or generated. That's what I was trying to explain earlier. For all intents and purposes, there's no difference whether the hash is done on the device or on the iCloud servers.

Further, I believe Apple intends to go fully E2EE for iCloud Photo in the future, and this hash implementation will enable that. Otherwise there's no way for Apple to implement E2EE and still comply with reporting CSAM material stored on their servers.

I'm just sad that many folks misunderstand things. We can choose not to trust what Apple is saying, and that's fine. We come to conclusions based on our beliefs and biases, and that's fine too. But posting something that's not accurate is not right.
 
Not sure if you understand how Apple is implementing the entire widget.

There will be no hash generated (which many on the forums have termed 'scanning', though as far as I understand it, it is not a scan of photos), and thus no security voucher generated, if there's no upload to iCloud Photo. Thus nothing leaves the phone.
Doubters will insist there's no way to know the voucher isn't generated. While they are wrong about that, the important thing to note is that even if it is, and the 'scan' is performed, that 'scan' is effectively disabled because the positive/negative result can't be learned on the phone. Only iCloud can divine the result. It's like a blood test: the results aren't learned in the needle, they are learned at the lab.
The hash is only done for photos that are about to be uploaded to iCloud Photo. If iCloud Photo is not enabled, no photo will be hashed and no security vouchers will be sent (security vouchers are only generated for matched hashes; otherwise, as I understand it, no security vouchers are generated). If you trust what Apple is saying, that's all there is to it.
Security vouchers are generated for every image being uploaded to iCloud. An essential part of the system is that no process can ever know if there’s a match until the threshold is met on the server, so every uploaded image is treated the same way.
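
The threshold part works along the lines of threshold secret sharing. Below is a generic Shamir sketch, not Apple's exact construction, and the threshold of 30 is just the figure Apple has mentioned publicly: the key protecting voucher contents is only recoverable once enough shares from matching images have arrived.

```python
# Generic Shamir secret sharing sketch; Apple's real scheme differs in detail.

import random

PRIME = 2**127 - 1  # Mersenne prime; field big enough for a toy secret

def make_shares(secret, threshold, count):
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    f = lambda x: sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0; below-threshold input yields garbage.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if j != i:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

key = 123456789  # stands in for the key protecting voucher contents
shares = make_shares(key, threshold=30, count=100)  # one share per match
print(reconstruct(shares[:30]) == key)  # True:  threshold reached
print(reconstruct(shares[:29]) == key)  # False: one short, output is garbage
```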
And if you are already sending photos to iCloud Photo, your photos are leaving your device and being stored on Apple's servers, regardless of whether any hashes or security vouchers are computed or generated. That's what I was trying to explain earlier. For all intents and purposes, there's no difference whether the hash is done on the device or on the iCloud servers.

Further, I believe Apple intends to go fully E2EE for iCloud Photo in the future, and this hash implementation will enable that. Otherwise there's no way for Apple to implement E2EE and still comply with reporting CSAM material stored on their servers.
Apple are not obliged to search for CSAM, only to report it if they find it. I believe Apple's intention is only to prevent a future E2EE product from facilitating predators in their crimes (which is a very responsible position, IMO), but you can't block CSAM without finding CSAM, and you can't find CSAM and then not report it.
I'm just sad that many folks misunderstand things. We can choose not to trust what Apple is saying, and that's fine. We come to conclusions based on our beliefs and biases, and that's fine too. But posting something that's not accurate is not right.
Apple have been too clever for their own good with this mathematical-ethics stuff. We saw hints of it with their Differential Privacy. It's complicated. People won't get it on the first read, and if you don't get the messaging right, most people won't give it a second read.
 