you do not know that.
it's in the technical document.

NeuralHash

NeuralHash is a perceptual hashing function that maps images to numbers. Perceptual hashing bases this number on features of the image instead of the precise values of pixels in the image. The system computes these hashes by using an embedding network to produce image descriptors and then converting those descriptors to integers using a Hyperplane LSH (Locality Sensitivity Hashing) process. This process ensures that different images produce different hashes.

The embedding network represents images as real-valued vectors and ensures that perceptually and semantically similar images have close descriptors in the sense of angular distance or cosine similarity. Perceptually and semantically different images have descriptors farther apart, which results in larger angular distances. The Hyperplane LSH process then converts descriptors to unique hash values as integers.

For all images processed by the above system, regardless of resolution and quality, each image must have a unique hash for the content of the image. This hash must be significantly smaller than the image to be sufficiently efficient when stored on disk or sent over the network. The main purpose of the hash is to ensure that identical and visually similar images result in the same hash, and images that are different from one another result in different hashes. For example, an image that has been slightly cropped or resized should be considered identical to its original and have the same hash.
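The hyperplane-LSH step described above can be sketched in a few lines. This is a hypothetical illustration, not Apple's actual code; the descriptor dimension (128) and hash width (96 bits) are made-up parameters, and random vectors stand in for real embedding-network descriptors:

```python
import numpy as np

def hyperplane_lsh(descriptor: np.ndarray, planes: np.ndarray) -> int:
    # Each random hyperplane contributes one bit: which side of the
    # plane the descriptor falls on (the sign of the dot product).
    bits = (planes @ descriptor) >= 0
    # Pack the bits into a single integer hash.
    return sum(1 << i for i, b in enumerate(bits) if b)

def hamming(a: int, b: int) -> int:
    # Number of bits in which two hashes differ.
    return bin(a ^ b).count("1")

rng = np.random.default_rng(0)
planes = rng.standard_normal((96, 128))   # 96-bit hash over a 128-d descriptor
desc = rng.standard_normal(128)

# A slightly perturbed descriptor (small angular distance) lands on the
# same side of almost every hyperplane, so the hashes agree in most bits;
# an unrelated descriptor disagrees in roughly half of them.
near = desc + 0.01 * rng.standard_normal(128)
far = rng.standard_normal(128)

print(hamming(hyperplane_lsh(desc, planes), hyperplane_lsh(near, planes)))  # small
print(hamming(hyperplane_lsh(desc, planes), hyperplane_lsh(far, planes)))   # large
```

The key property is that the probability two descriptors disagree on any one bit is proportional to the angle between them, which is how angular closeness in descriptor space turns into closeness of integer hashes.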

"Visually similar images" means images that were modified but share the same content. Apple gave the example of an RGB photo converted to black and white: NeuralHash would still produce a match. However, two photos taken of the same object at different angles are different content and would therefore produce different hashes.
 
What a turnaround this is.

For years, people have sought ways to de-Google their Android device due to invasive behavior by Google.

Now it looks like people are starting to wonder how to de-Apple their Apple device because of invasive behavior by Apple!
I expect that Google and Samsung will immediately jump on Apple's about-face on privacy in their ads. This is golden for them. It completely undermines Apple's privacy credibility, and they can profit from that. No longer can Apple claim the privacy high ground. There is enough confusion about Apple's message that simple ads and a bit of spin can easily take advantage of it. Samsung will do this even if they later plan on doing something similar; it is the Samsung way.
 
Please educate yourself on the subject before making ill-informed comments.

It requires 30 matches for them to even begin to investigate. 30 matches means you undoubtedly have child porn on your phone, as 30 false positives would be virtually impossible.
I think you're seriously missing the point of the slippery slope being created here; these tools turn bad overnight the next time an attack causes more Patriot Act-type laws to be imposed. And don't think a hacker can't load up your iCloud with anything they want and then just sit back and wait for them to come arrest you.
 
Apple also scanned 100 million images of a general nature, which resulted in 3 false positives.

That clearly indicates the probability that any one account would have on the order of 30 such false positives is extremely low. It also shows that having one false positive doesn't really increase the probability of having several more false positives to reach the threshold. If that were the case, you would have seen clusters and a much larger number of false positives.
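A back-of-the-envelope version of that argument, assuming independence across images (exactly the assumption under debate in this thread) and a hypothetical library of 10,000 photos:

```python
from math import comb

p = 3 / 100_000_000   # per-image false-positive rate implied by Apple's test
n = 10_000            # hypothetical size of one account's photo library
threshold = 30        # matches required before human review

# Probability that a single account accumulates `threshold` or more
# false positives, if each image is an independent Bernoulli trial.
# Terms far beyond the threshold are numerically negligible, so only
# a short range of the binomial tail needs to be summed.
tail = sum(comb(n, k) * p**k * (1 - p)**(n - k)
           for k in range(threshold, threshold + 40))

print(f"expected false positives per account: {n * p:.6f}")
print(f"P(>= {threshold} false positives): {tail:.3e}")
```

Under the independence assumption the tail probability is astronomically small; the objection in the reply is that correlated photos (edits of one image, shots from the same angle) would make the effective trials fewer and more dependent than this model assumes.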
I can't confidently say the probabilities are independent. What if you have one original image triggering the false positive, then several slightly edited versions? Or, more likely, several other shots from the same angle?
 
I expect that Google and Samsung will immediately jump on Apple's about-face on privacy in their ads. This is golden for them. It completely undermines Apple's privacy credibility, and they can profit from that. No longer can Apple claim the privacy high ground. There is enough confusion about Apple's message that simple ads and a bit of spin can easily take advantage of it. Samsung will do this even if they later plan on doing something similar; it is the Samsung way.
It would actually be smarter for Google to say they would never do this with Android and portray themselves as the guardian of privacy. Not that anyone would believe Google, but it's still a golden moment for them in the war. Note that the manufacturers could stay out of it; Android is Google, after all.

I can't confidently say the probabilities are independent. What if you have one original image triggering the false positive, then several slightly edited versions? Or, more likely, several other shots from the same angle?
Like I said before, it doesn't matter what the error rate is if you're the one that gets nailed for something you didn't do.

they can add all the dirty photos they want and none will be a match unless the photos exist in the csam database and the hashes match ... how would they do this?

this protocol by apple doesn't suddenly make it easier to add photos to people's phones

you have the ability to look at your photos so you would see anything added, and if your phone has a good password no one can get in

we could extend the paranoia of this argument about stuff added to your phone endlessly, right?
Most do not practice good security: weak passwords, etc. You're right that it doesn't make it easier; it's just as easy as it's always been. Hard for me, easy for people who are into it. And finally, if someone has decided to take you down, I'm sure they know exactly what they would need to upload to do it. You might notice new photos on your phone a week from now; some may never notice, as not everyone looks at their camera roll often.
 
I can't confidently say the probabilities are independent. What if you have one original image triggering the false positive, then several slightly edited versions? Or, more likely, several other shots from the same angle?
apple has stated emphatically that the hashes derived from the csam material do not change if the original material is altered; they must use some kind of hash of key parts/highlights of the image
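A toy "average hash" (aHash) shows how a hash can survive alteration of the image. This is not NeuralHash, just the simplest perceptual hash, demonstrated on a synthetic gradient standing in for a structured photo:

```python
import numpy as np

def average_hash(img: np.ndarray) -> int:
    # Block-average the image down to an 8x8 grid, then threshold each
    # cell at the grid's mean: 64 bits that survive resizing, mild
    # cropping, and other edits that preserve the overall structure.
    h, w = img.shape
    small = img[:h - h % 8, :w - w % 8].reshape(
        8, (h - h % 8) // 8, 8, (w - w % 8) // 8).mean(axis=(1, 3))
    return sum(1 << i for i, b in enumerate((small > small.mean()).flatten()) if b)

def hamming(a: int, b: int) -> int:
    # Number of differing bits between two hashes.
    return bin(a ^ b).count("1")

# A synthetic 64x64 gradient "image": pixel (i, j) has value i + j.
img = np.add.outer(np.arange(64.0), np.arange(64.0))
cropped = img[3:, 3:]            # a slightly cropped copy of the same content
inverted = img.max() - img       # genuinely different content

print(hamming(average_hash(img), average_hash(cropped)))   # small: near-match
print(hamming(average_hash(img), average_hash(inverted)))  # large: no match
```

The cropped copy hashes to nearly the same bits because block averages barely move, while the inverted image flips almost every bit; a perceptual hash keyed to image structure behaves this way by design, unlike a checksum, where any altered byte changes the whole hash.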
 
apple has stated emphatically that the hashes derived from the csam material do not change if the original material is altered; they must use some kind of hash of key parts/highlights of the image
I think the trust in what Apple said before this is different from the trust afterwards; they would have to do a full reversal, but even then the trust would not be the same. I honestly did think Apple would never do anything like this. They screwed the pooch, having spent years almost bragging about privacy as a reason to be in the Apple ecosystem.
 
I think the trust in what Apple said before this is different from the trust afterwards; they would have to do a full reversal, but even then the trust would not be the same. I honestly did think Apple would never do anything like this. They screwed the pooch, having spent years almost bragging about privacy as a reason to be in the Apple ecosystem.
no doubt, apple has permanently damaged their brand with this; even if they now come out with e2ee they will be damaged

google is loving every freaking moment of this debacle
 
Cool. You trust virologists and epidemiologists, and you clearly want “experts” to provide you with the best information.

I gave you a very simple method to verify the quality of information from these experts you mentioned.

You are an EE, so I assume you can perform an electrical rating, yes? You would do this with the appropriate oscilloscope or analyzer, yes? What’s the max bandwidth of your best tool? How precise are your measurements outside that range? Is there any kind of rolloff in your electrical expertise past the limitations of your observational abilities?

Would it be unprofessional for you to give an absolute measurement based on guesses outside the range your tools can detect? At least without some qualification of uncertainty? Would it be ethical for you to “ballpark” it, and then make absolute declarations of a device’s output rating beyond what you could know for certain? How do you measure its efficiency? With a percentage, yes? Why?

I’m not going to give you easy answers. It might take you a little time to skim around a few TEM and STEM articles and forums. But this way, the journey of discovery is all your own, bypassing any personality preconceptions, forum rhetorical rituals or any pesky partisan coconut boundaries, if applicable. You must know a few things if you’re an EE, so you can relatively easily drive yourself right upside this knowledge.

Your only breadcrumb: Why would expert virologists be writing papers to propose ideas and techniques for improving STEM and TEM viral results? Why would an expert ever want to upgrade from a “TEM Pro Max 12” to the “TEM Pro Max 13”? New form factor? Titanium case? Higher rez? What are they missing with the 12?

No thanks, I'll pass. If you had some credentials regarding infectious disease/epidemiology research to lay out, I might be interested in investigating further. Without that, I have no idea if your assessment questioning measurements is even valid to begin with. Even if I had the time, I don't have the background to make or judge that assessment.

It would be like me making generalizations regarding analog superheterodyne radios being better or worse than digital radios; i.e., digitally sampling an analog IF and then using digital downconversion and filtering techniques.

I wouldn't expect you to believe my assessment, or even related measurements (phase noise, noise figure, sensitivity, spur free dynamic range, strong signal third order intermodulation, filter shape factor and ultimate rejection, and on and on) and the equipment used to measure those parameters being adequate for the task, unless I was able to demonstrate competence in digital radio design.

Getting back on track, with respect to infectious diseases and pandemics, I'll rely on those who have many decades of experience and a demonstrated track record dealing with them. Simple.

If you want to call them out because you believe measurements are faulty, feel free. I suspect most people will simply ignore your assertions.
 
i have read that this may be a prelude to e2ee from several influential folks, like this from john gruber

Which in turn makes me wonder if Apple sees this initiative as a necessary first step toward providing end-to-end encryption for iCloud Photo Library and iCloud device backups.

plus, it makes sense, apple has a legal responsibility to report child abuse and if they encrypt end-to-end the justice department may well use that as grounds to file a lawsuit against them on the grounds that they are shirking their responsibility to look out for csam, which may break apple's ability to provide any encryption going forward

this is apple's response to say "we are actively looking out for these materials and also protecting our users privacy with e2ee"

Excellent point! Could be!
 
I think your seriously missing the point of the slippery slope being created here, these tools become bad overnight the next attack that cause other Patriot type acts to be imposed…. And don’t think a hacker can’t load up your iCloud with anything they want and then just sit back and wait for them to come arrest you
Nice try, but that has nothing to do with the fact that the comment you made and I replied to was false. No one's life is going to be ruined by one bad match and I can't imagine that some random stranger is going to upload child porn to my iCloud account for kicks. I don't wear a tin-foil hat. That's paranoid fear mongering at its very worst. :rolleyes:
 
Nice try, but that has nothing to do with the fact that the comment you made and I replied to was false. No one's life is going to be ruined by one bad match and I can't imagine that some random stranger is going to upload child porn to my iCloud account for kicks. I don't wear a tin-foil hat. That's paranoid fear mongering at its very worst. :rolleyes:
Famous last words
 
don't be so sure, apple may be doing this on-device scanning as a way to keep authorities happy when they roll out e2e, it will serve to silence legislators who predictably scream "what about the kiddie porn!"
What good is end-to-end encryption up on a server if you can’t get information to and from it without being searched?

Obviously, from my previous posts, I am fully against this move. But it seems like you are saying that e2e still means something in the cloud even if you are searched getting there and coming back. Does it?
 
What good is end-to-end encryption up on a server if you can’t get information to and from it without being searched?

Obviously, from my previous posts, I am fully against this move. But it seems like you are saying that e2e still means something in the cloud even if you are searched getting there and coming back. Does it?
from what i have read i would outline it this way:

apple wants to do e2ee on everything, but they have a legal responsibility to report child porn/csam

so apple is doing this as the best way to please both government and users, and make no mistake, it won't take much for congress to pass a bill outlawing e2ee or demanding more proof of surveillance for csam

from what i read apple is being as transparent as they can by making the database visible, providing a hash for it, and using the confluence of 2 databases of csam to prevent bad actors from inserting non-csam images; by so doing they can offer e2ee which, though it requires apple to do hashing of our photos, still gives e2ee on all our stuff, which protects us against other folks' intrusion

yeah, we have to trust apple on this completely, will we? it remains to be seen
 
from what i have read i would outline it this way:

apple wants to do e2ee on everything, but they have a legal responsibility to report child porn/csam

so apple is doing this as the best way to please both government and users, and make no mistake, it won't take much for congress to pass a bill outlawing e2ee or demanding more proof of surveillance for csam

from what i read apple is being as transparent as they can by making the database visible, providing a hash for it, and using the confluence of 2 databases of csam to prevent bad actors from inserting non-csam images; by so doing they can offer e2ee which, though it requires apple to do hashing of our photos, still gives e2ee on all our stuff, which protects us against other folks' intrusion

yeah, we have to trust apple on this completely, will we? it remains to be seen
I’ll buy that. Makes a lot of sense.
 
After thinking about this today, I think the only solution for my family is to just go ahead and terminate iCloud, and we probably should freeze the phones at iOS 14 also. If iOS 15 is going to be searching even with iCloud turned off, then that would terminate Apple. Allowing tech to do what the government can't is just a bridge too far.
 
Apple, stop blowing smoke and revert accordingly.
What I still can't understand is why Apple is trying to implement this in the first place since they're all about "privacy"... I have a feeling it's about money...
 
from what i have read i would outline it this way:

apple wants to do e2ee on everything, but they have a legal responsibility to report child porn/csam

so apple is doing this as the best way to please both government and users, and make no mistake, it won't take much for congress to pass a bill outlawing e2ee or demanding more proof of surveillance for csam

from what i read apple is being as transparent as they can by making the database visible, providing a hash for it, and using the confluence of 2 databases of csam to prevent bad actors from inserting non-csam images; by so doing they can offer e2ee which, though it requires apple to do hashing of our photos, still gives e2ee on all our stuff, which protects us against other folks' intrusion

yeah, we have to trust apple on this completely, will we? it remains to be seen
I don't want e2ee if that's what it costs.
 
I just don't want Apple to be scanning iCloud, period. It's a way to look over and go through our private data. What if information gets leaked to the government or to criminals? Who's held responsible for that?

Find an alternative way to catch criminals. And why is Apple even getting involved?
Absolutely. It's not like nothing is being done on this issue. Law enforcement gets tens of billions each year to go after criminals, and the issue has well-funded, high-profile advocacy groups working constantly on it. Why does Apple have to take these kinds of intrusive steps, which place tens of millions of innocent users constantly under the spotlight as suspects? Why not give a big donation and be done with it? This is why many key experts think that there is something more to this, that it is laying the groundwork for something China or LE wants badly. Steve kept USERS first. Will the current Apple? I guess we know. (Edited for typos)
 
I don't want e2ee if that's what it costs.
but you are then trading one exposure for another: unencrypted data in the cloud open to bad actors ... or ... trust apple to do what they say they are going to do, and let's face it, they are trying to be on record about how seriously they will take all this

for me, on balance, i am leaning toward trusting apple as the lesser of 2 evils

let's not forget that the us congress can turn on a dime and outlaw e2ee, and then we get nothing at all, whereas now apple (on our behalf) has a legitimate point to make in defence of e2ee: they can say we have a sophisticated plan in place to find these materials, a law isn't necessary
 
Hashes and checksums are as old as time.
Actually perceptual image hashes are not very old.

There’s nothing new or "impressive" here that they've developed. Any developer is capable of doing this and this method is in fact used all the time to make sure files didn’t get modified or corrupted during upload or download.
That's not the kind of hash they are using. Anyway, try reading (and understanding) the cooperative client/server protocol they are using (private set intersection):


I still think the whole thing is misguided, but there is no question that the protocols they developed are very interesting.
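For readers wondering what a private set intersection even looks like, here is a toy Diffie-Hellman-style PSI sketch. Apple's actual protocol is a far more elaborate threshold variant, and this small prime group is for illustration only, not cryptographically sound:

```python
import hashlib
import secrets

P = 2**127 - 1  # a Mersenne prime; toy group, NOT secure parameters

def h(item: str) -> int:
    # Hash an item into the multiplicative group mod P.
    return int.from_bytes(hashlib.sha256(item.encode()).digest(), "big") % P or 1

def psi(client_items, server_items):
    a = secrets.randbelow(P - 2) + 1  # client's secret exponent
    b = secrets.randbelow(P - 2) + 1  # server's secret exponent
    # Client blinds its hashed items and sends them over.
    blinded = [pow(h(x), a, P) for x in client_items]
    # Server re-blinds the client's values with its own exponent...
    double = {pow(v, b, P) for v in blinded}
    # ...and blinds its own set once.
    server_blinded = [pow(h(y), b, P) for y in server_items]
    # Client raises the server's values to its exponent; because
    # (h(x)^a)^b == (h(x)^b)^a, equal items collide and are detected.
    return [y for y, v in zip(server_items, server_blinded)
            if pow(v, a, P) in double]

print(psi(["a", "b", "c"], ["b", "c", "d"]))
```

The point of the commutative blinding is that neither side ever sees the other's raw hashes, yet items present in both sets blind to identical values, so the intersection is learnable without revealing anything else.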
 
but you are then trading one exposure for another: unencrypted data in the cloud open to bad actors ... or ... trust apple to do what they say they are going to do, and let's face it, they are trying to be on record about how seriously they will take all this
No, I'm trading unencrypted for unencrypted. A bypass of the encryption makes the encryption irrelevant. They lost any trust I had when they announced the on-device scanning and nothing else to benefit me (like e2ee).
 
When you see how many fronts Apple is under pressure on from governments and the EU throughout the world over apps etc., plus the pressure from these same people for a backdoor for anti-terrorist/crime-fighting purposes, which Apple has apparently rejected, you can't blame Apple users for now wondering whether a deal has been struck for that backdoor. That would explain Apple's insistence on having the tools to do this on over a billion devices.

Even the argument Apple uses makes no sense, inasmuch as they suggest it protects privacy because other cloud organisations check more than hash information. That argument is vacuous, because on that basis they could use the same NeuralHash system on iCloud. Instead they seem to be trying to defend the indefensible by having a billion-plus users carry it embedded on their own hardware, which is exactly what some governments and agencies dream of: a backdoor to individual hardware.

With much of the clamour to get Apple to assist with these features, I think many people would be forgiven for concluding that perhaps a deal has been struck, whereby the threat of actions against Apple on various fronts is minimised and, in the case of some nations, sales in those countries are allowed at all.

Very dark days for Apple. The PR has been awful, and although Apple has made a few hashes (excuse the pun) of things in the past, this seems potentially the worst, and Apple's attempts to defend the indefensible seem to dig a deeper hole each time. Each excuse still doesn't prevent them doing what they SAY they are doing in preventing child abuse, etc., by using the same system on the cloud and NOT on users' hardware. If NeuralHash gives more privacy as Apple says, there is nothing that makes it impossible to do that on THEIR servers. Putting it on users' hardware tells its own story.
 
Actually perceptual image hashes are not very old.


That's not the kind of hash they are using. Anyway, try reading (and understanding) the cooperative client/server protocol they are using (private set intersection):


I still think the whole thing is misguided, but there is no question that the protocols they developed are very interesting.
Yes, interesting, but they have no reason for this to be used at the user end on users' hardware, which is surveillance. If they so wish, they can use it on their own servers; kicking against that sends out signals that it's nothing to do with child safety and everything to do with being the forerunner of a backdoor.

They can't even suggest it conveys more privacy, as that's playing with words: they suggest NeuralHash is safer for privacy because it only checks pictures, whereas other companies may check content.

However, this argument falls flat on its face because the same NeuralHash could work via the server, so there's no reason at all for it to be on users' hardware....unless they have other plans.
 