Until:

A POS you have as a friend on a social media site downloads the picture you posted of your kid in the bath. He/she then includes it in a batch of CSAM photos he/she trades online for other CSAM images. Your photo's hash gets collected and distributed by NCMEC. Now you have "CSAM material" on your phone.

or

Someone wanting to harm you does the same and posts your photo on a CSAM forum, knowing it will be distributed and the hash will be logged.
Do you honestly believe that the pics that are hashed are of babies taking baths?? Or any other seemingly "innocent" photo like that?

I don't even want to imagine what the hashed pics are and have the utmost respect for the people that choose to do this type of work to stop people spreading these types of pics.
 
Do you honestly believe that the pics that are hashed are of babies taking baths?? Or any other seemingly "innocent" photo like that?

I don't even want to imagine what the hashed pics are and have the utmost respect for the people that choose to do this type of work to stop people spreading these types of pics.

You might find this thread on Hacker News interesting - how to create dummy images that match the hash:
https://news.ycombinator.com/item?id=28106867
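For anyone wondering how a dummy image matching a hash is even possible: perceptual hashes are built so that visually similar images produce the same hash, which also makes engineered collisions far easier than with a cryptographic hash. Here is a toy average-hash sketch in Python (requires Pillow), much simpler than Apple's NeuralHash and purely illustrative:

```python
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: shrink, grayscale, threshold on mean brightness."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Differing bits; a small distance reads as 'same image' to a matcher."""
    return bin(a ^ b).count("1")

# Crops, recompression, and slight noise usually leave the hash intact,
# which is the point. But it also means an attacker who knows a target
# hash can search for an unrelated image that lands on the same bits:
# hamming(average_hash("original.jpg"), average_hash("dummy.png")) <= 5
```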
 
And sorry, the argument that Apple could abuse this system if they wanted to doesn't fly, because if they wanted to, they could backdoor your phone with the next update and nobody would notice until the damage was done. The same is true for Android, or literally any non-open-source operating system running on any networked device. Everyone who's the least bit informed knows this and has either quietly accepted it or has opted out of using closed hardware and software. Alternatives do exist. They may not be pretty, and may sometimes be impractical, but they do exist.

The question is: is it in Apple's interest to do this? Is Apple willing to risk its reputation, the foundation of the hundreds of billions of dollars it earns annually? The answer is no, it is not. If widespread abuse of this kind were to become known - and it would become known, since hardly any company has so many eyes on it and what it does - it would cost Apple tens of billions of dollars.

This.
1000x this.

God, people are dense.

Nothing has changed; any non-open-source system could already be “pressured” by Xi or the FBI behind the scenes. You can be sure those actors were already acutely aware of this.
24/7 background file indexing/AI analysis/metadata/etc. has been a thing forever.
Low power always-on CPU cores, always-on motion coprocessors, always-on mics, always-on location services likewise.
Unfathomably powerful Neural Engines in every device.

There’s not a particularly new “cat” out of the bag.
Just a specific use case for comparing hashes in a privacy-minded way, with cryptographic checks and balances.

It could be abused “at gunpoint”, like anything else could be abused at gunpoint.
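To make the “cryptographic checks and balances” part concrete: Apple's design gates any human review behind threshold secret sharing, so the server can decrypt nothing until enough matches accumulate. A toy Shamir-style sketch in Python of just that one mechanism (the real protocol also layers private set intersection on top, so treat this as illustration, not Apple's implementation):

```python
import random

P = 2**127 - 1  # a Mersenne prime; the field for this toy example

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` so any `threshold` of `count` shares recover it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, count + 1)]

def recover(shares) -> int:
    """Lagrange interpolation at x = 0 reconstructs the secret."""
    total = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * -xj % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

# One share is released per matched image. Below the threshold the
# shares reveal nothing; at the threshold the decryption key falls out.
key = random.randrange(P)
shares = make_shares(key, threshold=30, count=100)
assert recover(shares[:30]) == key   # 30 matches: key recovered
assert recover(shares[:29]) != key   # 29 matches: still locked out
```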
 
Do you honestly believe that the pics that are hashed are of babies taking baths?? Or any other seemingly "innocent" photo like that?

I don't even want to imagine what the hashed pics are and have the utmost respect for the people that choose to do this type of work to stop people spreading these types of pics.
I’m a sworn LEO doing mobile forensics that deals with CSAM for a living. Yes. I do believe it.
 
Can someone explain why Apple thinks it's their job to make sure you're not a pedophile?

They're literally going to spy on you to make sure you're not a pedophile.

How is this within the scope of their business model?

I seriously cannot get my head around this.

It reminds me of the preachers who keep telling you being gay is bad, while they're secretly cheating on their wife with Geoffrey the stable boy.
 
Yeah, a false positive means a photo somebody took in private is now being viewed by somebody who has no right to look at it.

Being struck by lightning is 2 million times more probable.
A single photo isn’t enough.
You missed a lot of pieces of the puzzle.
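To put rough numbers on “a single photo isn’t enough”: assume, purely for illustration (my assumptions, not Apple’s published internals), a one-in-a-million false match per photo, a 20,000-photo library, and the roughly 30-match threshold Apple has cited. The binomial tail works out like this:

```python
from math import comb

# Illustrative assumptions only; Apple has not published a per-image rate:
p = 1e-6     # assumed chance one innocent photo falsely matches the DB
n = 20_000   # photos in the library
t = 30       # roughly the match threshold Apple has cited

# P(at least t false matches): binomial tail, dominated by its first terms
p_flag = sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(t, t + 10))
print(f"chance of a false flag: {p_flag:.1e}")
# ~1e-83 under these assumptions, versus roughly 1e-6 annual odds
# of being struck by lightning
```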
 
A step too far. Control control control. Sometimes you just have to let people be people and let the chips fall. The real purveyors of child abuse are not going to be caught out by this. Yes it seems like a noble endeavour but the road to hell is paved with good intentions. Systems like these will 100% be adapted at some time down the line and used to target and apprehend those committing wrongthink.
 
You might find this thread on Hacker News interesting - how to create dummy images that match the hash:
https://news.ycombinator.com/item?id=28106867
You might want to understand that you would need access to the hashed images themselves to even create a dummy image.

So, what is the scenario here? I download an illegal, already-hashed image from the database used to catch child pornographers so I can create a similar hashed image? And then what... somehow get my enemy to download the image so it is uploaded to their iCloud? And then what... do this multiple times so their account is flagged and reviewed by an Apple employee, who sees that the images are in fact NOT illegal images from the database?

What is your point, and what is the purpose of "hacking" hashed images to create fake ones? And again, this ignores the fact that one would need to be in possession of illegal child pornography to make your inane post even relevant in this thread.
 
They implemented it in such a way that it can easily be used to scan for copyright infringement as well.

Get the hash of that Ted Lasso web rip on TPB and, boom, find that web rip on an iPhone somewhere: you have caught a "criminal". For a company moving more and more in the direction of distributing its own online content, this is a very lucrative avenue.

Where is this blind trust coming from?

So they’re telling us they can do it and asking for our consent in the updated EULA this fall, instead of just doing it like they always could?
You know other cloud storage hosts take down copyrighted material, right?

Let’s get enraged if and when they actually do it for ACTUALLY OFFLINE stuff (i.e., not stuff you are about to upload to their iCloud servers anyway).

Until then, it’s just “stuff can happen”.
 
For your lame example, I would say that means child pornographers would get away with it under that system, not that innocent people would get flagged incorrectly.
No, that means there is zero assurance about what is going to happen and how the process works. By your logic, let's put all males in jail and then sort out who can walk free.
 
Until:

A POS you have as a friend on a social media site downloads the picture you posted of your kid in the bath. He/she then includes it in a batch of CSAM photos he/she trades online for other CSAM images. Your photo's hash gets collected and distributed by NCMEC. Now you have "CSAM material" on your phone.

or

Someone wanting to harm you does the same and posts your photo on a CSAM forum, knowing it will be distributed and the hash will be logged.

But doesn't the NCMEC check each photo before it gets added to the CSAM database? They've gotta be pretty sure about the nature of the photo before it gets hashed.

So I don't think my "baby-in-the-bathtub" photo will be flagged as child porn in the first place.

And if, for some reason, my douchebag friend narcs me out and the cops search my phone... the only photos they will find are my innocent personal photos.

I hear what you're saying... but it's really reaching.

¯\_(ツ)_/¯
 
No, that means there is zero assurance about what is going to happen and how the process works. By your logic, let's put all males in jail and then sort out who can walk free.
They issued multiple white papers and detailed overviews on how the system works and protects the innocent. If you don't believe it, that's your right.
 
Self-hosted cloud storage is an interesting option. Storing "Notes" and "Photos" in the cloud isn't rocket science, but the existing solutions aren't great. From poking around, NextCloud looks promising, especially if services offered turnkey setups.
I self-host using a Synology NAS. It's not cheap, but it's 100x more secure than using any cloud service, privacy-wise.
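For anyone tempted by the self-hosted route without buying a NAS: NextCloud exposes files over WebDAV, so scripting an off-iCloud photo backup is straightforward. A minimal Python sketch, with the URL, username, and app password as placeholders to substitute (Synology's DSM offers a comparable WebDAV service):

```python
from pathlib import Path
import requests

# Placeholders; substitute your own instance and an app password:
BASE = "https://cloud.example.com/remote.php/dav/files/alice"
AUTH = ("alice", "app-password-here")

def upload(local: Path, remote_dir: str = "Photos") -> None:
    """PUT one file into the NextCloud WebDAV tree."""
    url = f"{BASE}/{remote_dir}/{local.name}"
    with local.open("rb") as f:
        resp = requests.put(url, data=f, auth=AUTH, timeout=60)
    resp.raise_for_status()  # 201 Created on first upload, 204 on overwrite

for photo in Path("~/Pictures/backup").expanduser().glob("*.jpg"):
    upload(photo)
```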
 
Because Apple promised never to look at our data. The other companies are in the data processing business.
In order to rule out false positives, a human review is in place. That person will have to see your photos, and it’s gonna be an employee.
If it’s a correct positive, I think everyone is on board with all the consequences, but if it’s a false positive, that person has been the subject of surveillance. By a privacy-screaming brand.

1 in a trillion chance.
Multiple offences needed.
Not looking at the numbers is anti-vaxxer-grade stuff.

Plus, the image included in the security voucher (unlocked ONLY after multiple offences, so basically impossible to trigger by accident) is a low-res version of your pic.

Check your facts.
 
Can someone explain why Apple thinks it's their job to make sure you're not a pedophile?

They're literally going to spy on you to make sure you're not a pedophile.

How is this within the scope of their business model?

I seriously cannot get my head around this.

It reminds me of the preachers who keep telling you being gay is bad, while they're secretly cheating on their wife with Geoffrey the stable boy.
Your tortured analogy only reinforces the fact that you have no idea what you are angry about, and even less of an idea what you are talking about.

Read the description Apple released. Learn that the issue is much more limited and nuanced than you think it is.
 
I self-host using a Synology NAS. It's not cheap, but it's 100x more secure than using any cloud service, privacy-wise.

That's helpful, thanks! If you don't mind sharing: what files do you keep there, and how well does it integrate with macOS and iOS?

I really like how Notes and Photos integrate with iCloud. Trying to understand how I can replace that... perhaps Synology has its own Notes and Photos apps for iOS?
 