Wait. I’m not understanding something. Can someone please explain this to me.

I’m about to become a father in 2 weeks. There are going to be pictures taken of my child, some clothed and some “nude”. All parents take pictures of their babies just being babies.

So am I going to be arrested because I took a picture of my nude baby? Will I be considered a pedo because I took a pic of my own baby in the nude?

Please tell me I’m misunderstanding this whole thing because if that is the case then I’m burning my iPhone to ashes first thing in the morning.

Please can someone tell me I’m overreacting because I’m stupid and don’t understand this whole thing.

Thanks.
 

You are fine.
This only applies to photos whose hashes match images that have been placed in a special CSAM database by NCMEC or legal authorities.
 
It's not going to work that way.

Come iOS 15, your phone will only compare photos you upload to iCloud against known child-pornography images already in the agency's database. The comparison is done with hashes, so your phone can identify whether a photo matches that material without actually knowing what your images are about.

So the only way your phone could be flagged is if photos of your newborn child (congrats, by the way) somehow ended up in their database, the chances of which are, for all intents and purposes, zero.

Continue to take photos of your child for keepsakes. It's not going to bring about any issues whatsoever.
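In rough code terms, the check looks something like this (a toy Python sketch only; Apple's real system uses a perceptual "NeuralHash" and encrypted safety vouchers, and the hash set below is entirely hypothetical):

[CODE]
import hashlib

# Hypothetical stand-in for the database of known-CSAM hashes from NCMEC.
# The phone only ever receives opaque hash values, never the images.
KNOWN_HASHES = {"3f2a9c..."}  # placeholder entry

def image_hash(image_bytes: bytes) -> str:
    # Toy stand-in: a cryptographic hash of the raw bytes. The real
    # NeuralHash is perceptual, so near-duplicate images also match.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    # A brand-new photo of your own baby can only match if that very
    # image had already been added to the database.
    return image_hash(image_bytes) in KNOWN_HASHES
[/CODE]

The point of the design is that the device checks membership in a list of hashes, not whether a photo "looks like" anything.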
 

Thank you for the explanation. I almost had heart failure lol.

And thanks! I’m very excited but nervous. So much crap (literally) I’m going to have to clean up :/

While I do see a benefit to what Apple is doing (there’s a special place in hell for pedos), it does concern me that Apple can suddenly change their own rules and decide to invade my privacy.
 
Seriously... someone give me one realistic example of how on-device hashing can be abused. Not sci-fi fantasy... walk me step by step through how someone could abuse it.

I'll even let you assume that someone can break into the phone and alter the hashes and what they can tag. What next?

And again, remember that Apple already does on-device analytics that are way more open than this....
NCMEC is indirectly run by the US government and provides Apple with the hash database; Apple cannot legally create the hash database from the raw material. It would not be too hard for them to slip in hashes of political or anti-government content, and the only thing protecting us from that is the human reviewers at Apple, looking at icon-sized representations of the images, actually caring about their jobs and not just blindly confirming hits.
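The concern is easy to state in code: on the device side a hash is just an opaque value, so nothing distinguishes a CSAM hash from the hash of, say, a protest flyer (a minimal sketch; the byte strings are made up):

[CODE]
import hashlib

def h(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# The device receives only the hashes and cannot tell the entries apart.
database = {
    h(b"<known CSAM image bytes>"),       # legitimate entry
    h(b"<anti-government flyer bytes>"),  # hypothetical slipped-in entry
}

photo = b"<anti-government flyer bytes>"
print(h(photo) in database)  # True: flagged, even though it is not CSAM
[/CODE]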
 

Unless they have a FISA order that says they have to cooperate?
It is likely a matter of when, not if. Sadly.
 
NCMEC is indirectly run by the US government and provides Apple with the hash database; Apple cannot legally create the hash database from the raw material. It would not be too hard for them to slip in hashes of political or anti-government content, and the only thing protecting us from that is the human reviewers at Apple, looking at icon-sized representations of the images, actually caring about their jobs and not just blindly confirming hits.

So someone is going to add hashed pictures that are NOT child pornography to the database…got it…

and assuming that could even happen…you now have to match 30 of those pictures on your phone…

….okay…gazillion-to-one shot…but sure, because as we know, I might have the exact pics they’re looking for on the phone.

And then of course, I upload them to iCloud…I mean, I have iCloud turned on, so actually, that's the only plausible part of this scenario.

AND THEN…when it’s reviewed by Apple at that point and none of the pics are child pornography, but they match whatever the government was looking for, that Apple employee is obviously a secret plant from the government who then notifies their handler to let them know who I am, so they can send the black van to my house and haul me in, never to be seen again.

You’re right…totally plausible scenario. Thanks for educating me.
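For what it's worth, the 30-match threshold in that scenario can be sketched in a couple of lines (a toy illustration; per Apple's published description, the matching "safety vouchers" reportedly cannot even be decrypted below the threshold):

[CODE]
MATCH_THRESHOLD = 30  # reported number of matches before review is possible

def escalate_to_human_review(match_count: int) -> bool:
    # Below the threshold the matching "safety vouchers" reportedly
    # cannot be decrypted by Apple, let alone reviewed by a human.
    return match_count >= MATCH_THRESHOLD
[/CODE]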
 
Unless they have a FISA order that says they have to cooperate?
It is likely a matter of when, not if. Sadly.
They have the capacity to scan every image on their cloud servers like every other cloud company. What's stopping them from scanning for anti-government images now?
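For comparison, a sketch of what that server-side status quo looks like (everything runs on the provider's machines with no device cooperation required; the names and data here are made up):

[CODE]
import hashlib

DATABASE = {hashlib.sha256(b"<known image bytes>").hexdigest()}

def scan_stored_photos(stored_photos: dict[str, bytes]) -> list[str]:
    # Runs entirely server-side over photos already on the provider's
    # servers; the user's device never sees or limits the scan.
    return [
        name
        for name, image_bytes in stored_photos.items()
        if hashlib.sha256(image_bytes).hexdigest() in DATABASE
    ]

print(scan_stored_photos({"vacation.jpg": b"<known image bytes>"}))
[/CODE]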
 
That's what I was thinking, but then people just say "Well, I'll turn off iCloud Photos then", but that's exactly what you can do in iOS 15 too.
 
They have the capacity to scan every image on their cloud servers like every other cloud company. What's stopping them from scanning for anti-government images now?

No idea. In 2019 Apple talked about it. It's in the EULA that they can, but they never did outside of warrants/subpoenas.
It is odd in comparison to all the others that did. Maybe it is just the fact that they are not required to do it?
 
Spent some time on Reddit reading the various posts on this topic.
WOW! :oops:

Everything from “It doesn’t exist” to “Trump/Musk are forcing Apple” to “It’s the end of the world! Nuke Apple” to “Think of the children!”.
This forum is so very mild in comparison.
 
I just came across a very neat method to destroy the life of an unwanted person with the help of Apple's new control bureau:

- Install "Pegasus" spyware on the phone of that person
- Send a bunch of the illicit pictures to that person with the help of "Pegasus"
- Switch on iCloud Photos

This is immediately possible with iOS 15 - no other file types have to be declared "illegal" by some dictator.
Sit back and wait for the result... (get some popcorn)
...and that person is gone.

Very clever method.
 
Has this already happened with any other cloud services provider that already scans material saved on their servers?
 
I asked myself that question. You already can do that, but this is not "on device". This opens a new dimension, as it is triggered just the moment you upload it. And there is "proof on device" as a bonus...

I am not the panic guy, but there are things you have to stop before they even start.
And: I am of no interest to anyone, and I do not own illicit material (ok, some Hollywood movies from somewhere...), but I still do not like the idea of this control mechanism.
 
I just came across a very neat method to destroy the life of an unwanted person with the help of Apple's new control bureau: …

Wouldn't the spyware be detected on the phone, along with its installation date, rendering everything found suspect?
 
I asked myself that question. You already can do that, but this is not "on device". This opens a new dimension, as it is triggered just the moment you upload it. And there is "proof on device" as a bonus...
All material uploaded to Google's servers originates "on device." That device may be your own Android phone and I imagine Google scans everything immediately upon upload. Again, has your example already happened with any other cloud services provider that already scans material saved on their servers?
 
Wouldn't the spyware be detected on the phone, along with its installation date, rendering everything found suspect?
As I wrote, I am not the panic guy. But this "Pegasus" created quite some noise in the world recently, so this additional trigger is a marvellous "plugin" for it. The big problem with "Pegasus" was (or is) that it did not get detected...

Just one more piece in the big puzzle...
 
- Install "Pegasus" spyware on the phone of that person …

That first step is the critical piece ;)
 
I asked myself that question. You already can do that, but this is not "on device". This opens a new dimension, as it is triggered just the moment you upload it. And there is "proof on device" as a bonus...

I think the question asked implied this has been possible for several years on the iPhone, so why haven't we heard about it?

So how would you do it?

1. Use some method, as you describe, to secretly (how?) add images to the Photo Library
2. The iPhone user is using Google Photos, probably the world's most used image library
3. Google Photos automatically uploads every photo to Google's cloud servers
4. Google scans Google Photos in the cloud using the same CSAM hashes
5. Google reports to NCMEC

Have you any evidence this has occurred?
 
Although I think child porn is horrible and should be prevented, and people arrested for it, it is the fact that Apple is going from providing technology to the world to becoming a policeman for the world that bothers me. Stick to great technology and leave the rest to someone else.
 