So I guess those Apple employees have no issues with child abuse?

Disgusting stance that so many are defending.
See, and this is what happens with the "Think of the children!" mindset.
You're blinded by the idea that it's protecting children.
And children *should* be protected.
However, the 4th amendment makes this pretty clear....
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

Apple has no business playing the "overwatchers", checking everyone's photos, violating privacy, on the hopes that they catch someone. If the FBI goes to Apple and says "We have a warrant to check this person's pictures", that's fine. Hand them over. But Apple should NOT be proactive in scanning and reporting people's photos. They can certainly be reactive.

And I agree with the other posters who said "Scan them before they're stored in iCloud". That's fine. I can get behind that, because Apple doesn't want to be responsible for storing illegal materials. Meanwhile though, my phone drains battery fast enough without background scans of my photos "for the children".
 
See, and this is what happens with the "Think of the children!" mindset.
You're blinded by the idea that it's protecting children.
And children *should* be protected.
However, the 4th amendment makes this pretty clear....
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

Apple has no business playing the "overwatchers", checking everyone's photos, violating privacy, on the hopes that they catch someone. If the FBI goes to Apple and says "We have a warrant to check this person's pictures", that's fine. Hand them over. But Apple should NOT be proactive in scanning and reporting people's photos. They can certainly be reactive.

And I agree with the other posters who said "Scan them before they're stored in iCloud". That's fine. I can get behind that, because Apple doesn't want to be responsible for storing illegal materials. Meanwhile though, my phone drains battery fast enough without background scans of my photos "for the children".
You're assuming that your definition of "unreasonable" matches what they are doing here.
 
This is interesting… searches of your device by private parties aren't covered by the 4th amendment.

Case file for OneDrive and PhotoDNA

“The Court denies Bohannon's motion. First, the Fourth Amendment does not apply to Microsoft's search, as Microsoft was not acting as a government entity or agent. Even if the Fourth Amendment applied, Bohannon consented to Microsoft's search by agreeing to Microsoft's terms of service.” United States v. Bohannon, Case No. 19-cr-00039-CRB-1, 5-6 (N.D. Cal. Dec. 11, 2020)
 


Apple employees are now joining the choir of individuals raising concerns over Apple's plans to scan iPhone users' photo libraries for CSAM or child sexual abuse material, reportedly speaking out internally about how the technology could be used to scan users' photos for other types of content, according to a report from Reuters.


According to Reuters, an unspecified number of Apple employees have taken to internal Slack channels to raise concerns over CSAM detection. Specifically, employees are concerned that governments could force Apple to use the technology for censorship by finding content other than CSAM. Some employees are worried that Apple is damaging its industry-leading privacy reputation.
Apple employees in roles pertaining to user security are not thought to have been part of the internal protest, according to the report.

Ever since its announcement last week, Apple has been bombarded with criticism over its CSAM detection plans, which are still expected to roll out with iOS 15 and iPadOS 15 this fall. Concerns mainly revolve around how the technology could present a slippery slope for future implementations by oppressive governments and regimes.

Apple has firmly pushed back against the idea that the on-device technology used for detecting CSAM material could be used for any other purpose. In a published FAQ document, the company says it will vehemently refuse any such demand by governments.
An open letter criticizing Apple and calling upon the company to immediately halt its plan to deploy CSAM detection has gained more than 7,000 signatures at the time of writing. The head of WhatsApp has also weighed in on the debate.

Article Link: Apple Employees Internally Raising Concerns Over CSAM Detection Plans
One of the best possible outcomes would be for people to realize how much privacy they have lost to tech companies who have become maid-servants to government (in a certain sense). This is the start of the rise of fascism.

If Apple is doing this in iOS 15, you had better believe it is coming to macOS as well.

Perhaps this will push people back to using dSLRs and third party photo management software.
 
That's not how it works. Consider Google's reverse image search, you can flip the image, rotate it, resize it, and even slightly modify it, and still be able to determine if it's visually similar enough to be the same image. Given that the CSAM dataset is a subset of "All images on the internet", the dataset is smaller, and you can do even more computationally heavy processing to catch even more variance. Even more so considering the CPU cycles can be run overnight on your phone, and not at Apple's expense.

Ultimately this means that an image like the Tiananmen Square image, cropped, resized, embedded with a block of text, would still get flagged as wrong-think, and the CCP police would come knocking.
Then that’s not hashing, that’s feature detection algorithms (which is more worrying than hashes). I’m fairly certain that with enough modification, an image will become different enough to fool even feature detection.
 
But it would? If you change something even slightly, it would change the hash of the image. You can't have something resistant to hash changes, because that's how hashes work. Now, unless they aren't using hashes, or are using something in addition to hashing, it's impossible to have a hash that doesn't change when editing the file. If that were possible, elements of cryptography would be nonexistent.

You could even modify the file or run it through third party modification software and it would be classed as a ‘different’ file.
No. That's a completely false statement. There are many ways of computing a hash that don't amount to a trivial operation on a bit array. The hash algorithm used is one that evaluates the essence of a photo using visual analysis techniques, not a trivial checksum of file content. See the technical discussion on Microsoft's version, Photo DNA, for example.
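To make the distinction concrete, here's a toy sketch in Python using Pillow. It's a minimal "average hash", nowhere near NeuralHash or PhotoDNA in sophistication, but it shows why a perceptual hash can survive a resize that completely changes a byte-level hash like SHA-256:

# Toy "average hash" -- an illustration only, NOT Apple's NeuralHash or
# Microsoft's PhotoDNA. It shows the property under discussion: a perceptual
# hash barely moves under a resize, while a cryptographic hash changes entirely.
import hashlib
from PIL import Image

def average_hash(img, size=8):
    # Shrink to size x size grayscale, then emit one bit per pixel:
    # 1 if the pixel is brighter than the mean, else 0.
    small = img.convert("L").resize((size, size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    return int("".join("1" if p > mean else "0" for p in pixels), 2)

def hamming(a, b):
    # Number of bits on which two hashes differ.
    return bin(a ^ b).count("1")

# Synthetic photo so the example runs without an external file.
original = Image.radial_gradient("L").resize((512, 512))
edited = original.resize((300, 300))  # a rescaled copy of the same picture

# Byte-level (cryptographic) hashes: completely different after the edit.
print(hashlib.sha256(original.tobytes()).hexdigest()[:16])
print(hashlib.sha256(edited.tobytes()).hexdigest()[:16])

# Perceptual hashes: identical or off by only a bit or two.
print(hamming(average_hash(original), average_hash(edited)))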
 
If you don't see a problem with this technology, Apple is definitely playing you. Pretty sure you don't want Apple to be looking at/scanning/analyzing/identifying your wife's pictures. This CSAM stuff needs to be shut down. Apple needs to respect our privacy, period. Apple needs to figure out another way if they are really interested in catching pedophiles... God knows for what reason. It really feels like this is the end of an era. All good things come to an end.

It seems from your earlier post that you don't have a problem with the technology either. You have a problem with what it could potentially become, and with your personal, non-CSAM photos being synced or matched. You don't have a problem with the technology itself, but with Apple's implementation of it. Let's clarify that first.
 
So I guess those Apple employees have no issues with child abuse?

Disgusting stance that so many are defending.
Oh get over yourself. Do you REALLY think that perverts are going to think "oh no Apple are making it hard now, I'd better stop thinking like a dirty scumbag now"? It's NOT going to reduce child abuse by 1%. It will just make the scum go somewhere else whilst decent folk have their rights and privacy compromised even more.
 
Good, but sorry it's too late for me to stay with Apple.

I'm done with big tech. Bought a Nokia 5310 (2020) for calls and texts. That'll do.

I also have a wifi-only degoogled android for some apps when I'm at the house.

We'll see how it goes. I may take the degoogled android as the main phone in future, but for now, I'm going low-tech.
De-Googled?

De-Googled for some apps? If the phone is running Android from Google, you're being tracked, buddy, whatever you say.

CalyxOS is what you should use.
 
Oh get over yourself. Do you REALLY think that perverts are going to think "oh no Apple are making it hard now, I'd better stop thinking like a dirty scumbag now"? It's NOT going to reduce child abuse by 1%. It will just make the scum go somewhere else.
If you haven't been catching up on the specifics, there's nowhere else to go unless you get a dumb feature phone. Google, Microsoft, and many others have already been scanning all photos, and it wasn't reported to anybody; we're all just finding out now, over a decade later.
 
That is correct, but a physical Apple employee (that is, a human being) will be reviewing the pictures once they get flagged. That does not sound like an AI to me.

A human being would review the data, not the pictures. Please, when you start making these comments, make sure you have them accurate. Seems you are getting very emotional over it, and making small false statements.
That said, I don't think you have the right to ask for somebody to be fired, or to see to it that he is, because you disagree with the technology. Maybe a better suggestion would be to roll it back, or to take a look at the technology first and at what it does and how it's implemented. It's a sound cause, it's a just cause; how it's implemented is the issue. You shouldn't be asking for some guy who wants to make a positive change, but is doing it the wrong way, to be fired because you disagree.
 
If you haven't been catching up on the specifics, there's nowhere else to go unless you get a dumb feature phone. Google, Microsoft, and many others have already been scanning all photos, and it wasn't reported to anybody; we're all just finding out now, over a decade later.
Yep, cause buying a £50 hard drive and keeping them on there instead is IMPOSSIBLE, I tell you, it's impossible. You are presuming that paedophilic scum will stop once they can't share pics. No, they are more likely to go and do it in person. Be careful what you agree to and wish for.
 
Don't turn on iCloud Photo Library. But quite honestly, as all this detects is child pornography, it makes me wonder why you wouldn't want it turned on.
The problem is not what it is looking for now, but the concept of being scanned and where it can/will be expanded. I'm sure you don't do anything illegal in your home, but you probably don't want to have someone come look through it every day to check.
 
No. That's a completely false statement. There are many ways of computing a hash that don't amount to a trivial operation on a bit array. The hash algorithm used is one that evaluates the essence of a photo using visual analysis techniques, not a trivial checksum of file content. See the technical discussion on Microsoft's version, Photo DNA, for example.
As per my latest comment, that’s not a hash algorithm then, that’s feature detection algorithms such as the one you mentioned. That is much worse than simply comparing image hashes, but yes, it is more accurate and harder to spoof. However, it is more open to feature abuse.
 
It's pretty clear how that works if you take 5 seconds to do a Google search... or as Craig clarified in the updated story added today. Broad overview of course, but an organization that has been doing this successfully for 2 decades with multiple toll gates in place should be trusted on some level, I think.

Or you can choose not to trust anything or anyone and assume everyone is out to get you. Have fun with that...
They’re not out to get me

They’re building the Panopticon
 
Yep, cause buying a £50 hard drive and keeping them on there instead is IMPOSSIBLE, I tell you, it's impossible. You are presuming that paedophilic scum will stop once they can't share pics. No, they are more likely to go and do it in person. Be careful what you agree to and wish for.
They will just FedEx flash drives.

Regular old fashioned police work, a supportive social agency, a disgusted citizenry, and an engaged judiciary using realistic sentencing are the tools to stop child abuse/trafficking.

It’s really odd that Apple suddenly has become cyber-Chris Hansen….
 
If you really are worried about this just don’t use iCloud 🤔🤔
The ability to scan and the hash list is included in iOS. I don't want that ability on my phone.

Apple said in 2015 that a backdoor for the FBI would be "the software equivalent of cancer". What they are building right now is basically the software equivalent of herpes. Sure, it isn't always an active infection, but it's always there, lurking in the cells, ready to be activated again. And there's no way to get rid of it.
 
Can someone explain how this tech works? I find face scanning and photo recognition tech gets it wrong a fair amount of the time. Is this going to flag pictures that might be an issue and upload them for someone to look at? People take a lot of nude photos, so is it going to be scanning for nude photos it thinks are of young people and sending them to a person to look at? I don't get how it's supposed to work.
 
File names don't come into it. Also, Apple's implementation compensates for crops, tweaks and resizes to the original file. Still using hashes.
You just explained that there is some leeway in the system. If it were a 1:1 match, it would be much better received. But since they have some estimation going on, it's worrying quite a few people. I know one mid-20s married couple that is long distance most of the time and shares some adult images over iMessage; they are quite concerned about this.
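As I understand it, that leeway is a distance threshold rather than an exact match: two perceptual hashes count as the same image if they differ in at most a handful of bits. Something roughly like this sketch, where the hash value and the threshold are made up for illustration (Apple hasn't published NeuralHash's matching parameters in this form):

# Hypothetical threshold matching on 64-bit perceptual hashes.
# The known_hashes entry and MAX_DISTANCE are illustrative values only.
MAX_DISTANCE = 4  # up to this many differing bits still counts as "the same image"

known_hashes = {
    0xF0F0F0F0F0F0F0F0,  # stand-in for one hash from the database
}

def hamming(a, b):
    # Count the bits on which two hashes differ.
    return bin(a ^ b).count("1")

def is_match(candidate):
    # A candidate matches if it is within MAX_DISTANCE bits of any known hash.
    return any(hamming(candidate, h) <= MAX_DISTANCE for h in known_hashes)

reference = next(iter(known_hashes))
print(is_match(reference ^ 0b101))            # 2 bits flipped (mild edit) -> True
print(is_match(reference ^ ((1 << 20) - 1)))  # 20 bits flipped (different image) -> False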
 
Apple is not trying to catch anybody. Apple is only complying with the law to not allow illegal materials to be stored in their servers.

IMHO, this will lead to E2EE for iCloud Photos, which is currently still stored in iCloud where Apple has the keys to decrypt the contents. Once Apple can prove to the authorities they have a meaningful way to prevent unlawful materials being stored in their servers, they can proceed to enable E2EE.
They already do this on iCloud. So why move it to our devices? What is gained by having this scan run on the device when we have iCloud enabled anyway?
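For what it's worth, as publicly described the on-device step only runs for photos that are about to be uploaded to iCloud Photos, and the result travels with the upload as a voucher Apple can only read once an account crosses a match threshold. A very rough conceptual sketch, with the actual cryptography (blinded hashes, threshold secret sharing) omitted and every name invented for illustration:

# Conceptual sketch only -- NOT Apple's code or API. The real design hides the
# match result cryptographically; a plain boolean stands in for all of that here.
from dataclasses import dataclass

MATCH_THRESHOLD = 30  # Apple has said review only happens past roughly 30 matches

@dataclass
class SafetyVoucher:
    photo_id: str
    matched: bool  # hidden from the server in the real system

def prepare_upload(photo_id, neural_hash, hash_database):
    # Runs on the device, and only for photos being uploaded to iCloud Photos.
    return SafetyVoucher(photo_id, matched=neural_hash in hash_database)

def needs_human_review(vouchers):
    # Server side: in the real design the vouchers stay unreadable until the
    # number of matches for an account crosses the threshold.
    return sum(v.matched for v in vouchers) >= MATCH_THRESHOLD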
 