It scans your whole library and they report you if anything is flagged as suspicious? I thought this was just for filtering graphic content out of iMessages. How is that not a huge violation of privacy? If I have an image of some creepy naked cherub angel a religious relative sent me years ago, am I going to be reported to the FBI? What about all those kids in bathtubs and naked baby pictures that new moms spam us with in group messages?

I know I'm 100% deleting everything on iCloud and will be getting off of Apple's platform if that's what this means and I didn't misunderstand. I'm not one for slippery-slope arguments, so I'm not going to make one, though I have the feeling this is about to be tech that China will love, but this just screams Big Brother.
That isn’t how it works. Have you actually read anything here?
 
In the short term maybe not, but even before this, companies were getting fed up with the walled garden (Epic, Spotify). I know lots of friends/family who have moved to Android, and I stupidly was trying to get them back to Apple for iMessage and stuff like that. Short-term loss (not much), long-term loss (very likely imo).
Are you aware of Apple's financials? Whatever people say, Apple is raking in the cash. This won't change anything.
 
Isn't the police's job to catch criminals, like paedophiles?


This is starting to look like Minority Report with Tom Cruise. Frightening.

What is the next step when scanning iOS and macOS? Your political views? Other crimes?


Strange: when the FBI recently wanted Apple to open up a terrorist's iPhone, Apple refused and cherished privacy.

But now Apple will scan each device regardless of "guilty" or "innocent."
What's the next step for Apple? An Apple employee coming to every household to make sure you don't have anything inappropriate at home?
 
Then how does it work? The article and everything I've googled sound like it flags images and then sends them for human review, so you're left at the mercy of some random Apple employee.
It checks the hashes of your pics against a database of hashes of already known child abuse images. Assuming you don't have those already known pics in your possession, none of your pics will be flagged. The chance of a human ever looking at your pics is so astronomically small it may as well not exist.
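For what it's worth, the simple version being described above is just checking file hashes against a list of known ones. Here's a minimal sketch of that idea in Python (plain SHA-256 and a made-up hash list, purely illustrative, not Apple's actual pipeline):

```python
import hashlib

# Hypothetical list of known-bad hashes; the real database is maintained by
# NCMEC and uses perceptual hashes, not plain SHA-256 digests like this one.
KNOWN_BAD_HASHES = {
    "0123456789abcdef" * 4,  # placeholder entry, not a real hash
}

def file_sha256(path: str) -> str:
    """Ordinary cryptographic hash of a file's raw bytes."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_flagged(path: str) -> bool:
    # Only an exact byte-for-byte copy of a known image would ever match here.
    return file_sha256(path) in KNOWN_BAD_HASHES
```

The point of the scheme described above is that your own photos are never "looked at", only compared against hashes of images that are already known.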
 
Of course not. Even people buying old phones and not updating to iOS 15 won't change anything; the scanning is still going to happen, and it has been happening for many years.

That doesn't address my question.

It's either a Yes or a No.

Easy.
 
No. The hash algorithm in question is designed to be resistant to basic image manipulation. In particular, steganography, which typically doesn't change the appearance of an image, would not have any effect on the image hash.
But it would? If you change something even slightly, it changes the hash of the image. You can't have something resistant to hash changes, because that's how hashes work. Unless they aren't using hashes, or are using something in addition to hashing, it's impossible to have a hash that doesn't change when you edit the file. If that were the case, whole areas of cryptography would fall apart.

You could even modify the file or run it through third party modification software and it would be classed as a ‘different’ file.
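That is exactly how cryptographic hashes behave, and it's the crux of the disagreement in this thread. A quick sketch of the point being made (plain SHA-256, nothing Apple-specific):

```python
import hashlib

original = b"the exact bytes of some photo"
tweaked  = b"the exact bytes of some photo!"  # one trivial change

# A cryptographic hash changes completely with any tiny edit (the "avalanche
# effect"), which is why a plain file hash can never survive crops, resizes,
# or re-encodes of an image.
print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(tweaked).hexdigest())
```

Where the two sides are talking past each other is that Apple's matching isn't this kind of hash at all; the NeuralHash posts further down cover that.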
 

Enough is enough, Apple! It is incredibly distasteful for you to SPY on us as consumers. Apple, it will be in your best interest to give us an opt-out option for this CSAM feature, please.


Where are you, Craig? You said this yourself at WWDC 2021. What happened?

“At Apple, we believe privacy is a fundamental human right,” said Craig Federighi, Apple’s senior VP of software engineering. “We don’t think you should have to make a tradeoff between great features and privacy. We believe you deserve both.”

This is a punishment and a big slap in the face for anyone who owns an iPhone. Whatever you are trying to accomplish with this, Apple, leave it in the hands of government and law enforcement.

1. How about the CSAM scanning runs on Apple executives' iPhones first? No one wants their privacy to be exposed. Please stop this nonsense and RESPECT our PRIVACY. It is our fundamental human right.

2. How come this CSAM stuff was not mentioned by Apple during WWDC 2021? Apple is up to something. Why now, when we are almost at the release date of iOS 15...


Also, this guy needs to be FIRED from Apple. He is the mastermind behind this CSAM scanning feature. What a joke!


Does anyone here have a game plan on how we can stop this crappy CSAM feature?

Switch to Android, which will probably be following suit next year, but in the meantime just switch to Android.
 
Apple's already been caught secretly recording and screening our private conversations, so why is everyone up in arms over THIS??

Nothing about it was ever secret. Why do people keep repeating the same false claims?
 
That is correct, but an actual Apple employee (that is, a human being) will be reviewing the pictures once they get flagged. That does not sound like an AI to me.

The only problem with that report from Apple is it's completely untrue. It cannot accurately predict anything based on only having access to the NCMEC database; they would need a comprehensive database from all crime agencies, both in the USA and abroad. THEY DO NOT HAVE ACCESS TO THEM!

I'm glad employees are kicking off, as they recognise that their future employment relies upon Apple's stance on privacy for users and on ensuring surveillance doesn't take place, let alone Apple doing it themselves!

More worrying is that this SURVEILLANCE is built into the operating system and operates on a USER'S HARDWARE prior to anything moving to iCloud. I can see many court cases here: users have paid for their hardware, pay for the processing power that comes with the machine, and pay for the electricity it uses, and Apple is effectively usurping that equipment without the express permission of the user.

Mr Cook would have received my communication, so he knows how I feel about it...

A back-of-a-fag-packet idea in the name of child safety. Whether it was well intentioned or a deliberate ploy to engage in surveillance, I don't know, but I do know it is SURVEILLANCE, and it is a breach of their espoused and much-repeated statements on privacy and surveillance.

Apple: Stop this farce now, before it costs you a customer base.

When you get what you think is a lightbulb moment that turns out to be crap, it's best not to go forward with it and then try to justify it. The reason it has raised so many concerns, including, may I add, from certain agencies involved in fighting child abuse, is that apart from being surveillance, it doesn't work, and it will make the job of the agencies entrusted with fighting these crimes much harder, as those engaged in this awful child abuse will simply go underground: Tor, VPNs and encrypted files.

This has nothing to do with child abuse; it's SURVEILLANCE, and once you get past the emotive excuse for it, you should not hesitate to make your representations known to Apple, to your governments, or even to trading associations.

Had my say, done what I can so far and my colleagues have done likewise.

Remember all that is necessary for evil to prevail is for good people to do nothing.

It's rather akin to a Minority Report situation. I can only reiterate my first post about this when the news came out.

First they came for the 'suspect' Children's Pictures
And I did not speak out
Because I was not a Child Abuser

Then they came for the 'suspect' Adult pictures
And I did not speak out
Because my pictures were not those

Then they came for 'suspect' Animal Abuse pictures
And I did not speak out
Because I was not an Animal Abuser

Then they came for 'suspect' Law Breakers
And I did not speak out
Because I was not a Law Breaker

Then they came to control Everyone's Data
And there was no one left who could speak out
Not even Me!
 
If you wanted to view and own CSAM content, would you take the risk of modifying each pic and then uploading it to iCloud anyway? No, so it definitely will work as a deterrent. Unfortunately, they will simply move elsewhere.
Yes, it is a deterrent but my point was about the technical security control in place.
 
Also, Apple's implementation compensates for crops, tweaks and resizes to the original file. Still using hashes.
Apple's NeuralHash algorithm extracts features from the picture and computes a hash in a way that ensures that perceptually and semantically similar images get similar fingerprints. Apple so far has not said how flexible this system is, but its aims go well beyond traditional pixel-based hash matching.
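NeuralHash itself isn't public, so take this purely as an illustration of the general idea: a toy perceptual "difference hash" (using Pillow, with made-up file names) shows how similar images can end up with similar fingerprints, which is a very different thing from a cryptographic hash:

```python
from PIL import Image  # Pillow, used only for this toy example

def dhash(path: str, hash_size: int = 8) -> int:
    """Toy perceptual hash: survives resizes and minor tweaks to the image."""
    # Shrink to a tiny grayscale thumbnail so fine detail (and metadata) is thrown away.
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Similar-looking images give hashes that differ in only a few bits."""
    return bin(a ^ b).count("1")

# hamming_distance(dhash("original.jpg"), dhash("resized_copy.jpg")) is
# typically small, while unrelated photos differ in roughly half the bits.
```

The key consequence, as the posts below point out, is that matching becomes a similarity threshold rather than an exact equality check.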

Ah, so these are not just hashes, they are Apple NeuralHash hashes.

...and there's the devil in the details, which, if true, means that people are being disingenuous in saying that "it's only taking a hash of the image"... which makes it sound like they're taking the data in the photo to calculate a single value unique to the image (save for a vanishingly small - and mathematically calculable - probability of a 'hash collision') which is looked up in a table. Which would be the typical meaning of "hashing" in computing. Of course, think a bit more deeply and that would be pretty much useless, because any trivial change to the image would give a different hash (which is often the whole point behind common uses of hashes - such as checking the md5 of a downloaded file, or as part of a blockchain system).

"in a way that ensures that perceptually and semantically similar images get similar fingerprints" takes this firmly into machine learning/AI territory - especially if the "Neural" bit means that neural networking techniques are used (...which means that there *isn't* a well-defined algorithm that can be checked - just a machine learning system that has been "trained" with a finite set of data).

Nobody ever oversold the accuracy of the AI system that they were selling, right?

So such a match doesn't mean that two images are identical within a 1-in-a-trillion chance of a hash collision. It's a fuzzy match based on some heuristic measurement of "perceptual and semantic similarity". Only pedantry separates that, in principle, from face matching or nudity detection. I'm sure hashing is a common technique in AI/ML - that doesn't mean the result is "just a hash".

...plus, it means that if the algorithm incorrectly matches an innocent photo in your collection, then the chances of you having other "perceptually and semantically similar" photos are significantly increased. So the "only multiple matches will trigger a flag" assurance is worthless. There have been horrible miscarriages of justice resulting from people reasoning that "two one in a million chances = one in a trillion" without knowing that they are assuming that the events are independent,

...which makes it even more important what criteria the "human reviewers" will be applying, whether they've got their Statistics 202 (Statistics 101 is just as likely to reinforce the misconceptions), and how much discretion they'll have in deciding if an image is offending, when many of the "false matches" they see won't be totally random, but will have been flagged because they contain something that resembles the offending image.
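The independence point is worth making concrete. With a couple of made-up numbers (purely illustrative, not Apple's published error rates), the gap between "independent" and "correlated" false matches is enormous:

```python
# Purely illustrative figures, not Apple's published error rates.
p_first_false_match = 1e-6            # assumed chance that one photo falsely matches

# Naive reasoning treats a second false match as independent of the first.
p_two_if_independent = p_first_false_match ** 2          # 1e-12, "one in a trillion"

# But if a false match implies you probably own other similar-looking photos,
# the second match is conditional on the first and far more likely.
p_second_given_first = 1e-2           # assumed: 1 in 100 once a similar photo has matched
p_two_if_correlated = p_first_false_match * p_second_given_first   # 1e-8

print(f"assuming independence: {p_two_if_independent:.0e}")
print(f"assuming correlation:  {p_two_if_correlated:.0e}")  # 10,000x more likely
```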
 
If you don't see a problem with this technology, Apple is definitely playing you. Pretty sure you don't want Apple looking at/scanning/analyzing/identifying your wife's pictures. This CSAM stuff needs to be shut down. Apple needs to respect our privacy, period. Apple needs to figure out another way if they are really interested in catching pedophiles... God knows for what reason.

Regardless of what I think of the feature, it totally doesn’t work the way you believe it does.
 
Could governments force Apple to add non-CSAM images to the hash list?
"Apple will refuse any such demands."

As if the US doesn't have a secret document that they could pull out of a drawer that says Apple has to do as the American government says, and that if they tell anyone they go to prison for 25+ years...
Nice paranoid fantasy.
In reality, they have on multiple occasions refused requests by the FBI and other agencies to do things like that, and taken a lot of heat for it. But sure, Apple has been secretly doing the government's bidding this whole time, and decided to publicly announce this particular feature because?
If Apple wanted to secretly spy on people, THEY WOULDN'T HAVE ANNOUNCED IT.
 