Where are you, Craig? You said this yourself at WWDC 2021. What happened?

“At Apple, we believe privacy is a fundamental human right,” said Craig Federighi, Apple’s senior VP of software engineering. “We don’t think you should have to make a tradeoff between great features and privacy. We believe you deserve both.”
Please explain how your right to privacy has anything to do with you having an exact copy of an illegal photo that someone else created. A photo of which the FBI also has a copy. A photo which, by uploading to iCloud, you are illegally distributing.
 
You can guarantee the government has more "naughty file hash databases" than just CSAM.

I'm sure there's the "persons of interest hash database", the "insurgent file hash database"...

But we NEED to get those January 6th insurgents!
 
Please explain how your right to privacy has anything to do with you having an exact copy of an illegal photo that someone else created. A photo of which the FBI also has a copy. A photo which, by uploading to iCloud, you are illegally distributing.
Because we don’t believe that this stops at “illegal photos”

Why do you think it will stop there?

Trump's admin could have used this to go after protesters in Seattle and Portland who shared images of how to deal with jackboot cops and how to contact legal aid, or far-left bloc imagery.

Biden's admin could use this to go after anyone who was even talking about going to the protest on 1/6 in DC (not the resulting Capitol riot) by flagging images of maps into protest areas or cringey right-wing boomer memes.
 
I think this only works if you have iCloud switched on btw.
So essentially you could just use another backup service for your photos (Dropbox, Google, etc.) and just not use the iCloud service.
Is that not good enough for you, or are you worried about the principle of it all?

(Also, remember that whatever backdoors are in the software, by law these companies will never tell you. Snowden demonstrated that a while back. All of the big companies were signed up.)

The only way to be truly secure is to run an open-source OS (e.g. Linux), where you can view every line of code (or someone has done it), and then set up your own private backup services, etc. A bit of effort, but worthwhile I suppose if you feel strongly about this stuff.
I’d probably be better off explaining things…
For as long as I can remember, I was PC, and then when phones came around, Android. I loved it but hated the lack of privacy. I moved to Apple years ago and always felt it was a more honest, more secure company with more honest, more secure products, which it has been up to now. They are much more expensive, but it's been worth it. Now it's not. I didn't trust the others but always felt Apple had my back. Now I trust no company, as Apple, in my opinion, has been fake over this "privacy is a fundamental human right" marketing rubbish. Now that I trust nobody, it's cheaper, easier, and far less restrictive to go back to Android and PC and have 100x the choice of products.

As I trust no company now, I will be putting my massive photo library (50k pics/videos and counting) of my kids, holidays, and private stuff offline. If I want to share something, I'll manually take it from the offline drive to the computer. I'm leaving Apple products out of principle.

To add, it isn't just this that has tipped me over. They talk about privacy, yet I sold a MacBook a while ago and forgot to turn off Find My. I contacted the chap and told him I could see his location, everything. He wasn't happy, I wasn't happy, but Apple wouldn't help. I told them I could prove (with the serial number) that I bought it new and also that I sold it to this chap (clearly showing the serial number). Apple won't turn my Find My off, the lad can't without my password, I can't travel to him as he lives 6 hrs away, and he can't either. This is just ridiculous.
 
Those other services are already presumed to be doing evil things.

We expect Apple to hold a higher standard. And we are taking them up on it. They are getting resistance on this in the mainstream, not just here and other Apple-fan venues.
I watched a documentary on what is happening with CSAM. And it's a losing battle. The encryption and obfuscation technology that these disgusting people can use is just too powerful. I feel like Apple is doing this so other companies can adopt it too, and then there will be fewer places to hide, maybe?

We do have to weigh individual privacy against the human rights of children here, right?

I actually think that from a marketing point of view this will be a win for Apple vs other phones, and I think other companies will be forced to implement something similar.

My only caveat is that if it only works with iCloud switched on, why does the search have to happen on your device instead of in the cloud? I don't get that. iCloud implies a mirror image of your photos in the cloud. What am I missing here?
 
Just for the record Tim Cook is a relative newcomer to Apple compared to me.

My involvement with Apple predates his by TWENTY TWO years.

I encouraged several major companies to stock and use Apple equipment, and, as Apple will know (and the UK authorities could confirm), I thwarted the first eBay fraud involving fraudsters selling non-existent Apple equipment in 1999.

A very long and happy association with Apple, and I do hope Apple listen to the concerns and step back from the abyss!
 
If you researched the technology, you'd understand that they are not flagging pictures…

Here is a question for you. Can they flag any “pic” that matches the index database?
Well, they are using both actually. They're using CSAM to look for those pictures, AND they're going to be implementing AI photo recognition for people under 18 sending nudes. This article has only been about the CSAM portion, BUT there's also the "Family plan" stuff, where if a person under 18 sends what Apple considers a nude, it asks them if they REALLY want to do it, and if they say yes, it sends a copy to their parents.
 
No. We want to use iCloud. We are holding Apple responsible for this BS. We expect them to be better.

At least I don’t believe that Apple’s interest in this is organic, and that they are being pressured to implement this infrastructure. Otherwise, why would the “Government naughty file hash database” exist?
What anyone believes is immaterial.

The reality involves scanning your phone for unknown material (have we been told the truth, the whole truth, and nothing but the truth?) by some built-in AI algorithm of completely unknown quality (is it racially biased, or... or...?) in the phone you use. All phone contents are available for scanning, and if the scan deems any result close to officially provided content (provided by governments), it is flagged as suspicious. And you're in deep doo-doo!

In some countries, you'll be in severe danger during "mild interrogation", and if you're lucky you'll die of a heart attack within a few weeks. (You've won the jackpot if you're so lucky that it happens within hours.) That's a worst-case scenario, but those are "normal procedure" in many countries.

If I were black, I would be scared stiff if I were hauled in for questioning by any US police officer as a result of a false positive (or plain bug) in Apple's programming (especially if I were innocent and like to breathe)!

Regards
 
I switched from Android about 14 months ago. This is exactly the kind of stuff I expect from Google. So disappointing to find out Apple's privacy mantra was just marketing words. IMO Android is superior to iOS in many ways, so if the iPhone loses the privacy edge over Android I might as well go back. Pixel, here I come.
 
I switched from Android about 14 months ago. This is exactly the kind of stuff I expect from Google. So disappointing to find out Apple's privacy mantra was just marketing words. IMO Android is superior to iOS in many ways, so if the iPhone loses the privacy edge over Android I might as well go back. Pixel, here I come.
When you move away from Apple, you shouldn't use normal Android with its awful Play Store (= Trojan horse), but something like https://grapheneos.org without the Play Store.
 
What anyone believes is immaterial.

The reality involves scanning your phone for unknown material (have we been told the truth, the whole truth, and nothing but the truth?) by some built-in AI algorithm of completely unknown quality (is it racially biased, or... or...?) in the phone you use. All phone contents are available for scanning, and if the scan deems any result close to officially provided content (provided by governments), it is flagged as suspicious. And you're in deep doo-doo!

In some countries, you'll be in severe danger during "mild interrogation", and if you're lucky you'll die of a heart attack within a few weeks. (You've won the jackpot if you're so lucky that it happens within hours.) That's a worst-case scenario, but those are "normal procedure" in many countries.

If I were black, I would be scared stiff if I were hauled in for questioning by any US police officer as a result of a false positive (or plain bug) in Apple's programming (especially if I were innocent and like to breathe)!

Regards
Meh

If the police just want to be racist, they don't need this level of technology to murder black people. This is a higher-level form of control, and it's colorblind IMO.

But point taken on questioning in other countries.
 
I don’t get the outrage. Apple is scanning known hash values. That means they already have the photo they are scanning for; they are only looking for others who have it. They are also only scanning iCloud Photos. So if you’re concerned your photo library has something in it that could be flagged, just disable iCloud Photos.
In past years, two annual GM updates turned iCloud Photos on for me.

Just this week, iOS PB5 turned on iCloud alias addresses in FaceTime and Messages, which I had previously turned off.
 
I’d probably be better off explaining things…
For as long as I can remember, I was PC, and then when phones came around, Android. I loved it but hated the lack of privacy. I moved to Apple years ago and always felt it was a more honest, more secure company with more honest, more secure products, which it has been up to now. They are much more expensive, but it's been worth it. Now it's not. I didn't trust the others but always felt Apple had my back. Now I trust no company, as Apple, in my opinion, has been fake over this "privacy is a fundamental human right" marketing rubbish. Now that I trust nobody, it's cheaper, easier, and far less restrictive to go back to Android and PC and have 100x the choice of products.

As I trust no company now, I will be putting my massive photo library (50k pics/videos and counting) of my kids, holidays, and private stuff offline. If I want to share something, I'll manually take it from the offline drive to the computer. I'm leaving Apple products out of principle.

To add, it isn't just this that has tipped me over. They talk about privacy, yet I sold a MacBook a while ago and forgot to turn off Find My. I contacted the chap and told him I could see his location, everything. He wasn't happy, I wasn't happy, but Apple wouldn't help. I told them I could prove (with the serial number) that I bought it new and also that I sold it to this chap (clearly showing the serial number). Apple won't turn my Find My off, the lad can't without my password, I can't travel to him as he lives 6 hrs away, and he can't either. This is just ridiculous.
Interesting...

I get it. Apple removed the word "computer" from their name a while ago. I feel like Apple sells "appliances", not necessarily private/personal computer systems. They are very "opinionated" design-wise and are more interested in consumers who don't need or do detail (look at the way Settings is structured compared to Windows, for example; it's all about hiding complexity).

It really isn't the company to rest your life on if you believe in the kind of principles you do, as Apple will always design for the majority of their core customers' needs, i.e. no pornography, etc. It appeals to a particular demographic who would rather that choice be restricted so they can live in a "utopia" (hence the walled garden and restrictions).

Apple are marketing privacy from other capitalist businesses (FB, Walmart, etc.) but not from law enforcement. That is actually pretty core with them. They know far more of their core demographic will be against CSAM than worried about what issues could arise from the process they want to implement.

I still don't understand why the scan has to happen on the device rather than on the uploaded iCloud images in the cloud. If you have iCloud on, that means you want to back up your photos to the cloud. So why scan them on the device? That's not making sense to me.

The one thing I will say in Apple's defence is that at least they are telling us! And there is a white paper, etc. Other companies are not nearly as candid. Sometimes I'd rather trust a company that operates in that manner, even if I disagree with some of what they do. It's the ones that tell you nothing I'd be more worried about.
 
This is exactly the problem. Apple insists they will refuse to do this and I wish them luck but I don’t see how they can if countries pass laws requiring them to look for other things.

This pretty much says it all:

Apple saying they will refuse when governments request them to expand the scope of this surveillance software is laughable, since a government will legally order them to comply with the request, and there will be legal consequences if they refuse.
 
When you move away from Apple, you shouldn't use normal Android with its awful Play Store (= Trojan horse), but something like https://grapheneos.org without the Play Store.
While I applaud these types of privacy efforts, I have to assume at this point that they are (or will become) compromised.

We really need a tech giant like Apple in our corner to punch/push back with the interests of their customers in mind.
 
I watched a documentary on what is happening with CSAM. And it's a losing battle. The encryption and obfuscation technology that these disgusting people can use is just too powerful. I feel like Apple is doing this so other companies can adopt it too, and then there will be fewer places to hide, maybe?

We do have to weigh individual privacy against the human rights of children here, right?

I actually think that from a marketing point of view this will be a win for Apple vs other phones, and I think other companies will be forced to implement something similar.

My only caveat is that if it only works with iCloud switched on, why does the search have to happen on your device instead of in the cloud? I don't get that. iCloud implies a mirror image of your photos in the cloud. What am I missing here?
Don't get fooled like so many on these BBs. It has nothing to do with protecting children. It does not and will not. If anything, it will put more children at risk, simply because those who engage in such disgusting 'pursuits' will merely go underground (the dark web via Tor, VPNs, or heavily encrypted material), and that will make the job of the agencies charged with pursuing these people much more difficult.

Apple do NOT have access to all hashes relating to abused children; they are merely using ONE NCMEC database that contains data on missing and exploited children, and even that is only the tip of the iceberg.

NCMEC do great work, no argument, and they have sufficient information on their website to allow ANYONE to send in their concerns or pictures. I urge anyone with concerns to go to that site and send them data.

However, their database is literally a drop in the ocean, and police and authorities' data, both in the UK and the USA, is not given to companies, not even Apple. So the idea that this is to help safeguard children is simply flawed. NCMEC try, and good luck to them, but many government agencies will now be impeded if Apple goes ahead with this, because the culprits will go underground, making children LESS SAFE. It is best to communicate directly with either the authorities or NCMEC, as their website suggests.

But even with all of that, this is NOT about safeguarding children; it is SURVEILLANCE in the name of safeguarding children.

I was surprised yesterday to find an article suggesting that the software to do this will be embedded in iOS and macOS for all devices, and that it's technically incorrect to say "just switch off iCloud Photos", because the system software does a check before such pictures are even transmitted to iCloud.

That means everyone's hardware that they have paid for will be used, processing power will be reduced, and even the cost of electricity will be borne by the USER.

Once this happens, whatever a company says about it 'being designed not to do this or that' becomes irrelevant, and Apple's so publicly aired stance on privacy and against surveillance can no longer be trusted, because surveillance IS what is going to happen.

Can you imagine companies with multiple Macs, whose livelihood depends on privacy, now trusting Apple kit within their organisation? I won't be able to. No publishing house would, because a precedent will have been set: the hardware you bought from Apple in good faith will now be compromised by surveillance, in this case in the name of preventing child abuse, which IT DOES NOT PROTECT AGAINST but instead makes it harder for the authorities to pursue those responsible. Whatever way Apple tries to dress this up, IT IS SURVEILLANCE, and pernicious, and it's downhill from then on.

Could you imagine those who think it's a good idea because it's stated to be about safeguarding children being in favour if it was sold as SURVEILLANCE OF YOUR SYSTEMS? Of course not, which is why this idea is so pernicious.
 
Yikes. Reading the Technical Summary yields masterpieces of double-talk such as:

The hashing technology, called NeuralHash, analyzes an image and converts it to a unique number specific to that image. Only another image that appears nearly identical can produce the same number; for example, images that differ in size or transcoded quality will still have the same NeuralHash value.
"Nearly identical" doesn't mean "identical". If an image that is only nearly identical can generate the same number, then the number isn't "a unique number specific to that image". If you've encountered hashes as a way of verifying the authenticity of downloads, or while reading about blockchains, that's not what is happening here. OK, they're talking about images that differ in size and quality, so maybe you could call that "nearly identical", but that "nearly" makes a huge difference in the likelihood of a false match.
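To make that "nearly identical" point concrete, here's a toy sketch. This is not Apple's NeuralHash; it's a simple block-average perceptual hash (aHash) on hand-made pixel grids, purely to show how a perceptual hash survives a small change that completely alters a cryptographic hash:

```python
import hashlib

def average_hash(pixels, hash_size=4):
    """Toy perceptual hash (aHash): downscale by block-averaging, then
    emit one bit per block: 1 if the block is brighter than the mean."""
    h, w = len(pixels), len(pixels[0])
    bh, bw = h // hash_size, w // hash_size
    blocks = [
        sum(pixels[by * bh + y][bx * bw + x]
            for y in range(bh) for x in range(bw)) / (bh * bw)
        for by in range(hash_size) for bx in range(hash_size)
    ]
    mean = sum(blocks) / len(blocks)
    return "".join("1" if b > mean else "0" for b in blocks)

def crypto_hash(pixels):
    """Cryptographic hash of the exact pixel bytes, for contrast."""
    return hashlib.sha256(bytes(p for row in pixels for p in row)).hexdigest()

# An 8x8 grayscale "image": bright left half, dark right half.
img = [[200] * 4 + [20] * 4 for _ in range(8)]
# A "nearly identical" copy: every pixel nudged slightly brighter.
brighter = [[p + 10 for p in row] for row in img]

print(average_hash(img) == average_hash(brighter))  # True: perceptual match
print(crypto_hash(img) == crypto_hash(brighter))    # False: bytes differ
```

The flip side of that tolerance is exactly the worry above: a perceptual hash that matches "nearly identical" images can, in principle, also match images that merely look alike.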

The threshold is selected to provide an extremely low (1 in 1 trillion) probability of incorrectly flagging a given account.
OK, so let's just trust that Apple have read about Sally Clark and understand the difference between independent events (tossing a fair coin) and possibly correlated events (e.g. if one of your photos triggers a false match, how likely is it that there will be other "nearly identical" photos in your collection?), and haven't just multiplied the probability of a hash collision by the number of matches (...which would work, but for that pesky "nearly").
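A back-of-the-envelope sketch of why independence matters to a threshold-based claim like "1 in 1 trillion". Every number here is hypothetical (the per-photo false-match rate, library size, and threshold are invented for illustration, not Apple's figures):

```python
from math import exp, lgamma, log, log1p

def log_binom_pmf(k, n, p):
    """log P(X = k) for X ~ Binomial(n, p), computed in log space so
    tiny tail probabilities don't overflow via huge binomial coefficients."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log1p(-p))

def p_at_least(t, n, p):
    """P(X >= t): sum the pmf upward until terms stop mattering."""
    total = 0.0
    for k in range(t, n + 1):
        term = exp(log_binom_pmf(k, n, p))
        total += term
        if term < total * 1e-18:
            break
    return total

# Hypothetical numbers: 50,000 photos, a per-photo false-match
# probability of 1e-6, and a flagging threshold of 30 matches.
n, p, threshold = 50_000, 1e-6, 30

# If every photo is an independent trial, the tail is astronomically small.
independent = p_at_least(threshold, n, p)

# But photos are correlated: burst shots and edits travel in clusters.
# If they come in bursts of ~10 near-duplicates and one false match drags
# the whole burst along, only 3 burst-level matches reach the threshold.
burst = 10
correlated = p_at_least((threshold + burst - 1) // burst, n // burst, p)

print(independent < correlated)  # True: correlation fattens the tail
```

The point is not the specific numbers but the shape of the argument: multiplying per-photo probabilities as if they were independent can understate the account-level false-flag risk by many orders of magnitude.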

This is further mitigated by a manual review process wherein Apple reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC
Which is not the same as "reviews each report to confirm that the match really has found CSAM and, if so, disables the account and sends a report to NCMEC". If that's what they mean, why not say it clearly?

NeuralHash is a perceptual hashing function that maps images to numbers. Perceptual hashing bases this number on features of the image instead of the precise values of pixels in the image. The system computes these hashes by using an embedding network to produce image descriptors and then converting those descriptors to integers using a Hyperplane LSH (Locality Sensitivity Hashing) process. This process ensures that different images produce different hashes.

The embedding network represents images as real-valued vectors and ensures that perceptually and semantically similar images have close descriptors in the sense of angular distance or cosine similarity. Perceptually and semantically different images have descriptors farther apart, which results in larger angular distances. The Hyperplane LSH process then converts descriptors to unique hash values as integers.
and
The main purpose of the hash is to ensure that identical and visually similar images result in the same hash,

...so ignore the technicalities (which aren't technical enough to recreate and critique the process) and focus on how terms like "number unique to the image" or "identical" have gradually morphed via "nearly identical" and "perceptually and semantically similar" into "visually similar"... and that we're suddenly talking about analysing the features of the image (which is precisely what some people here are saying isn't happening "because hash").
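For the curious, the hyperplane-LSH idea quoted above can be sketched generically (this is the textbook technique, not Apple's actual implementation; the dimensions, bit count, and perturbation size are invented for illustration): random hyperplanes turn a descriptor vector into sign bits, so vectors at a small angular distance agree on most bits.

```python
import random

def hyperplane_lsh(vec, planes):
    """One bit per random hyperplane: 1 if the descriptor vector falls
    on the plane's positive side. Vectors with small angular distance
    agree on most bits; opposite vectors disagree on all of them."""
    return tuple(
        1 if sum(v * p for v, p in zip(vec, plane)) >= 0 else 0
        for plane in planes
    )

random.seed(0)
dim, n_bits = 16, 8
planes = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(n_bits)]

a = [random.gauss(0, 1) for _ in range(dim)]   # a "descriptor"
b = [x + random.gauss(0, 0.01) for x in a]     # tiny perturbation of a
c = [-x for x in a]                            # opposite direction

h_a, h_b, h_c = (hyperplane_lsh(v, planes) for v in (a, b, c))
print(sum(x != y for x, y in zip(h_a, h_b)))   # few or no differing bits
print(h_a == h_c)                              # False: every bit flips
```

Which illustrates the very slippage being criticised: the hash is deliberately stable under perturbation, so "same hash" means "similar descriptor", not "same image".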

Then we follow up with the truly impressive and reassuring demonstration that a colour picture of a palm tree generates the same hash as exactly the same image converted to monochrome, but a completely different cityscape (with nary a palm tree in sight) generates a different hash. Wow. Anybody reading this critically would be asking: what about a different picture containing palm trees, or maybe a similarly composed picture of a cypress tree? How about some examples of cropped/resized images, which couldn't be spotted by simply turning the image to B&W before hashing? Maybe the system can cope with that; if so, why not show it, rather than a trivial Sesame Street "one of these three things is not the same" example?

I'm not questioning whether the technology makes a good effort at a very difficult task (matching images without being fooled by inconsequential changes), but the summary reeks of "positive spin" and avoiding the difficult questions; and for any technology like this, the #1 question has to be "what are the dangers of a false match, and is the risk justified by the rate of successful matches?"

...and will people please, please stop saying "it's not scanning your images, it's only checking hashes" - that's a distinction without a difference even before you replace "hashes" with Apple(R) NeuralHashes(TM).
 