That's nice for iCloud, but it doesn't address the CSAM hash-checking function running locally on machines with macOS 10.15 and newer. If your computer comes with a quiet little built-in snoop that doesn't alert you to the presence of illegal material, but instead just alerts the federal police and wrecks your life, that's probably something everyone with teenagers should consider very deeply.
Apart from the fact that:
(A) Apple publicly announced they would not move forward with that solution, and
(B) At least one independent investigation verified that the above is unlikely to be part of the current macOS.
 
What worried me about all this was the scenario where you shoot a photo of your toddlers in the bath (or something else just as innocuous) and ten minutes later your pad is being raided by cops. AI isn't smart enough yet to know the difference.
So you never bothered to read how this actually works and just worried about something for the wrong reasons 🧐
 
Good. And besides, if the NSA, Mossad, et al. really wanted to access someone’s data, they would find a way to do it. CP wouldn’t be on their radar anyway, but my point is where there’s a will there’s a way, even if you need to pay a sexy spy to love bomb the target into giving you the password, or secretly develop a trillion dollar AI to crack encryption we currently believe to be uncrackable.
 
If I rent a locker and store porn in it, is the locker owner liable? Why would Apple be liable in any way? This just isn't the case. It's not a worry at all.
Apple would be liable because they are expressly saying there is no need for them to carry out CSAM detection, giving excuses as to why. Various vocal groups are saying Apple needs to implement CSAM detection on iCloud, and Apple has turned around and basically said no. Therefore, if CSAM material is found on iCloud by the police in the course of apprehending and investigating criminals, Apple could be held liable for knowing such a thing takes place and refusing to implement something that would have prevented the unlawful material from appearing on their iCloud in the first place.
 
So is this why Apple will no longer let Siri search and identify your pictures by location, object, and date? I mean, you can do that with Spotlight search and you can search in the Photos app. So why not let Siri search and make it easier on everybody who, like me, has thousands of pictures?
 
There were users on here who kept saying "You are not an Apple engineer" and "Apple knows better" and "Protect the children" and "What do you have to hide." Well, apparently everybody knew better than Apple, including the inventor of this technology, who called it "dangerous." And the present Apple knows better than the past Apple. Apple is not always right. Sometimes even the ordinary commonsense users know better.
 
I didn't say censor the material, just notify that the material is illegal as defined in practically every country on earth. It would be quite a leap to go from that to "that screenshot contains a racist joke, you should feel terrible".

Not everything created with good intentions is destined to be weaponised. That same logic would see us banning knives because they can be used to harm people.
Knives over a certain size (or using a specific method of operation like switchblades) are totally illegal. Because they’re primarily designed to hurt people.
 
What's strange is that this was all known about Apple when they defied the FBI request to unlock the San Bernardino iPhones years before. This isn't a new stance at all; it was more that the plan they had for inspecting photos went against their apparent stance.
 
Even ignoring the risks of this feature, it would not have been all that useful in accomplishing its desired goal. It would only be triggered when someone downloaded an image that was listed in the database of bad images and then uploaded that image to iCloud (likely using Photos). Anything stored only on-device would not be scanned at all.

Edit for clarity: Hashing the image happens on device, but this would only happen for files that would be uploaded to iCloud.
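
To make that concrete, here's a rough sketch of the flow as I understand it; it is not Apple's actual code. The real system reportedly used a proprietary perceptual hash (NeuralHash) plus cryptographic safety vouchers, so the SHA-256 digest and all of the type and function names below are placeholders purely to illustrate the control flow: hashing happens on device, but only for items bound for iCloud Photos, and matching is just a lookup against a list of known hashes.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch only, NOT Apple's implementation. SHA-256 stands in for
// the proprietary perceptual hash; the point is the control flow: items that
// never leave the device are not matched at all.

struct KnownHashDatabase {
    // Stand-in for a hash list supplied by child-safety organizations.
    let hashes: Set<String>
    func contains(_ digestHex: String) -> Bool { hashes.contains(digestHex) }
}

enum UploadDestination {
    case localOnly      // stays on device, never scanned
    case iCloudPhotos   // queued for upload, hashed before it leaves
}

struct PhotoItem {
    let url: URL
    let destination: UploadDestination
}

// Compute a hex digest of the file's bytes on device.
func digestHex(of fileURL: URL) throws -> String {
    let data = try Data(contentsOf: fileURL)
    return SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Returns the items that would be flagged: only upload-bound items are
/// hashed, and a flag requires a match against the known-hash list.
func flaggedItems(in items: [PhotoItem], against database: KnownHashDatabase) -> [PhotoItem] {
    items.filter { item in
        guard item.destination == .iCloudPhotos else { return false }
        guard let digest = try? digestHex(of: item.url) else { return false }
        return database.contains(digest)
    }
}
```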
 
The big wigs at Apple and the owner of this website should be extremely ashamed and embarrassed that there are Apple customers and MR members who would rather see the continuation of child abuse and child exploitation than have an effective system put into place to catch such behavior, because that is what happens every time a vile image is uploaded to iCloud: a child continues to be abused.

CSAM detection would be able to flag the image, and Apple would be able to report to law enforcement where the image was uploaded from, along with the time and date. That would allow law enforcement to carry out investigations, with the eventual result that the person who uploaded the image is arrested. That person could then disclose where they got the image from, leading to a domino effect of arrests, one after another along the chain of abuse, eventually reaching the ringleader who abused children and created the image in the first place.

But none of that will happen, because many Apple customers and many MR members value their privacy far too much. What is the well-known saying? 'If it saves one child's life then it is worth it.' For many MR members it is not. Hell would have to freeze over before they agree to any encroachment on their privacy.
 
Unfortunately a lot more people are involved than you would think. You'll find reports from people who have accessed the "dark web" describing thousands of communities with thousands of members, all circulating all kinds of awful material.

Worth a watch if you want to erode your remaining faith in humanity:
Again, it's easy to be sympathetic to "monitoring" (whatever that means) for illicit child exploitation images, and thus to weakening privacy.

It's the unintended consequences of weakening privacy and putting in place a censor's wet dream. Before you know it, the Party would define what you are allowed to think about.

There are plenty of laws already in place to deal with people who have exploited technology to access repugnant images. It is not up to companies such as Apple to do the government's snooping for them.
 
Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.

They get it! It is so refreshing to hear real talk from Apple, reflecting that they really did research this issue earnestly and listen to feedback from all sides. They are in a tough spot but it’s similar to the subject of backdoors in encryption. They want to help but they know they can’t break the whole system to do it.
 
Is there a deep hidden meaning to this? Basically, is Apple saying in not so many words that if they were to create software to scan for CSAM in iCloud, it could fall into the hands of data thieves who would exploit it, and thus to prevent such a thing happening they have no intention of creating the software in the first place? Much like the argument Apple uses for not building security backdoors into iPhones: they are worried such a thing would fall into the hands of criminals who would exploit it, and therefore it is better that no such thing exists.
It's not really about "data thieves." The part left unsaid is basically this:
If Apple creates the infrastructure to scan user data in iCloud Photos for CSAM in a "privacy-oriented way" as described here:
("the system will perform on-device matching against a database of known CSAM image hashes provided by the NCMEC and other child safety organizations"),
it opens the door to Apple being compelled to allow user data to be scanned against various other hashes for content deemed inappropriate or undesirable by the government of any country that is a major market for Apple (cough China under the CCP cough).

This would've created an insane amount of moral hazard for Apple and put vulnerable users in authoritarian countries at risk. Honestly speaking, the risk to people in first-world democracies was relatively small in comparison (albeit still not insignificant, particularly with the rise of would-be populists/authoritarians in many countries).
 
Apple just need to pray that criminals involved with child abuse and exploitation do not use iCloud for their ill-gotten gains, because if the police catch the criminals and find images on Apple's iCloud, the crap will hit the fan. Apple are saying their current systems for finding child abuse media on iCloud are already robust and that they therefore do not need to build CSAM detection into iCloud, but if the police were to find images, it would prove Apple's stance on the issue is baseless and cause a huge backlash against Apple, because many would then be saying that if Apple had implemented CSAM detection as they were asked to, the images the police found would not have been there.

First of all, it’s up to law enforcement agencies to track down child abusers. It’s not Apple’s job to put battery-sapping software on their customers’ devices just so they don’t have to spend the money on cloud scanning software.

Secondly, full stops are your friend.
 
The problem is gag orders in the US and other countries. If it is technically possible to scan any cloud data, Apple would be forced to do so very frequently, and Apple managers would face prison sentences if they even talked about it. So the only way is to NOT make it technically possible.

Imagine you commit a crime. If a friend of yours knows about it, he could face charges if he lies about it to the police. So better not to tell him anything. If a friend of mine commits a crime, I do not want to know about it. That is the same principle.

And of course if the US could force Apple to decrypt data, China, Turkey or Saudi Arabia could do the same.
 
What worried me about all this was the scenario where you shoot a photo of your toddlers in the bath (or something else just as innocuous) and ten minutes later your pad is being raided by cops. AI isn't smart enough yet to know the difference.

For this to happen, a picture of your toddler would have to have already shown up in the CSAM database, in which case you already have a bigger problem than your lack of understanding of how this actually works.
 
OK, but “fuzzy hashes” are still used by Cloudflare and freely available to all Cloudflare users, AND hashes cannot be reverse-engineered to reveal the original content… AND… those raw hashes are calculated and stored on upload to ensure the file(s) transferred without corruption; so saying it introduced a vector for compromise literally makes zero sense, but I am sure it will appease the people taking up arms over a non-issue.
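
In case it helps anyone picture that integrity-check use of hashes, here's a tiny illustration; the function names are made up for the example and this is not Cloudflare's or Apple's code. The point is that a digest stored at upload time only proves the bytes arrived intact, and cannot be reversed into the original content.

```swift
import Foundation
import CryptoKit

// Toy illustration of an upload integrity check, NOT any vendor's real code.
// SHA-256 stands in for whatever checksum a service stores at upload time.

func sha256Hex(of data: Data) -> String {
    SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Stand-in for the "server side": recompute the digest over the bytes that arrived.
func digestComputedByServer(for receivedBytes: Data) -> String {
    sha256Hex(of: receivedBytes)
}

/// True if the file made it across without corruption.
func uploadArrivedIntact(localFile: URL, receivedBytes: Data) throws -> Bool {
    let localDigest = sha256Hex(of: try Data(contentsOf: localFile))
    // The digest reveals nothing about what the file depicts and cannot be
    // reverse-engineered into the original bytes; it only confirms a match.
    return localDigest == digestComputedByServer(for: receivedBytes)
}
```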
 