Although I think child porn is horrible, should be prevented, and people should be arrested for it, what bothers me is that Apple is going from providing technology to the world to becoming a policeman for the world. Stick to great technology and leave the rest to someone else.
What about all the other services out there like Facebook, Instagram, Amazon, Google, etc.? Are they exempt from "policing" what is stored on their servers?
 
  • Haha
Reactions: dk001
If I need to place my trust in the hands of Apple, Google, MSFT, FB or Twitter et al., I am very happy to choose Apple. Anyone in the business of owning or distributing CSAM is not likely to use any sort of automatic cloud storage. Think about what many of you are saying: do we really want to make Apple the "platform of choice" for these sorts of things rather than "hurt the feelings" of a few who feel betrayed because Apple has chosen to address this evil? The rest of the conspiracy-theory talk is just silly; unless you are some public figure, what reason would anyone have to target a Joe Blow individual with the sort of high-level hacking that is being discussed?
 
I just came across a very neat method to destroy the life of an unwanted person with the help of Apple's new control bureau:

- Install "Pegasus" spyware on that person's phone
- Send a bunch of the illicit pictures to that person with the help of "Pegasus"
- Switch on iCloud Photos
- Sit back and wait for the result... (get some popcorn)

This is immediately possible with iOS 15; no other file types have to be declared "illegal" by some dictator. And away that person goes.

Very clever method.
Why use Pegasus? WhatsApp photos automatically get plunked into your Photos library on your iPhone. If that person has iCloud Photo sync turned on, they'll automatically have a real bad day in iOS 15.
 
  • Wow
  • Like
Reactions: pdoherty and dk001
Why use Pegasus? WhatsApp photos automatically get plunked into your Photos library on your iPhone. If that person has iCloud Photo sync turned on, they'll automatically have a real bad day in iOS 15.
You can switch it off.
… but Pegasus can probably switch it back on without telling you.
 
Why use Pegasus? WhatsApp photos automatically get plunked into your Photos library on your iPhone. If that person has iCloud Photo sync turned on, they'll automatically have a real bad day in iOS 15.
Same thing can happen right now!
 
But they could 😉

Isn't that what everyone is into these days anyway? The what-ifs and hypotheticals? 😉

Yes they could. Honestly surprised they haven’t.

Kind of. There is plenty of “what if” and “possibly could be” floating around mixed with a helping of “OMG”.

There is also genuine concern that this “solution” has not been adequately thought through.
 
There is also genuine concern that this “solution” has not been adequately thought through.
I get that, but why would Apple take such a serious thing lightly and not think of all the possible ways it could backfire? That's exactly why they've built in so many security checks and the hybrid on-device/server implementation.
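For instance, one of the publicly described checks is a match threshold: no single match does anything, and Apple has said vouchers only become readable to them once an account crosses roughly 30 matches. A minimal sketch of that idea (hypothetical names, with a plain length check standing in for the real threshold secret sharing):

```python
# Sketch of the match-threshold safeguard (illustrative, not Apple's code).

MATCH_THRESHOLD = 30  # Apple's reported initial figure, used here as an assumption

class SafetyVoucherStore:
    """Server side: collects encrypted safety vouchers for one account."""

    def __init__(self) -> None:
        self.vouchers: list[bytes] = []

    def add(self, voucher: bytes) -> None:
        self.vouchers.append(voucher)

    def can_review(self) -> bool:
        # In the real design, threshold secret sharing makes the vouchers
        # cryptographically unreadable below this count; a simple length
        # check stands in for that here.
        return len(self.vouchers) > MATCH_THRESHOLD
```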
 
I get that, but why would Apple take such a serious thing lightly and not think of all the possible ways it could backfire? That's exactly why they've built in so many security checks and the hybrid on-device/server implementation.

Apple is not perfect. That is why projects have hypercare and bug fixes.
The one aspect Apple has yet to really explain is why client-side. Basic risk analysis shows areas of high risk with this type of solution that, as far as we know or as Apple has indicated, have not been addressed.

I am not saying Apple is wrong. What I am saying is that this needs additional investigation and discussion before going live. As additional folks look into this (for example the EFF, and the Reddit reverse-engineering and spoofing work, among other discussions), it doesn't hurt to take another look and make sure that this is the correct path forward AND that, if followed, Apple gets it right. That audience needs to be more than just Apple.
 
  • Like
Reactions: Mega ST and Pummers
I have an alert set up for when a new vid on this shows up.
This one was kind of… an eye-opener, if factual.
In order to trick Apple, you would need a lot of things.

1. You would need the actual NeuralHash of a CSAM file, which would be highly illegal to obtain.
2. You would have to use software to generate a collision to match that hash.
3. Apple's server re-checks the image by decrypting it and generating a perceptual hash (different from the NeuralHash), then compares it to the known CSAM.
4. If the perceptual hash matches the CSAM image, it's passed to human review.
5. Then Apple looks at it and decides what to do next. They either verify that it's CSAM or they discard it if it somehow got through all the other checks.

That's a lot of steps to try to trick the system.

It's going to be interesting when a bunch of people try to "trick the system" when iOS 15 comes out, but it won't work, because the perceptual hash won't match the faked image.
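A minimal sketch of that two-hash recheck (purely illustrative: the function names, hash representation, and distance threshold are assumptions, not Apple's pipeline):

```python
# Illustrative server-side recheck with an independent perceptual hash.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hash values."""
    return bin(a ^ b).count("1")

def server_recheck(visual_derivative_hash: int,
                   known_csam_hashes: set[int],
                   max_distance: int = 4) -> bool:
    """An on-device NeuralHash match alone is not enough: the decrypted
    visual derivative is re-hashed with a different perceptual hash, so a
    collision forged against NeuralHash must also match here."""
    return any(
        hamming_distance(visual_derivative_hash, h) <= max_distance
        for h in known_csam_hashes
    )
```

The point of the second, undisclosed hash is that an attacker can only optimize their fake against the hash they can actually see.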
 
  • Like
Reactions: BigMcGuire
In order to trick Apple, you would need a lot of things.

1. You would need the actual NeuralHash of a CSAM file, which would be highly illegal to obtain.
2. You would have to use software to generate a collision to match that hash.
3. Apple's server re-checks the image by decrypting it and generating a perceptual hash (different from the NeuralHash), then compares it to the known CSAM.
4. If the perceptual hash matches the CSAM image, it's passed to human review.
5. Then Apple looks at it and decides what to do next. They either verify that it's CSAM or they discard it if it somehow got through all the other checks.

That's a lot of steps to try to trick the system.

It's going to be interesting when a bunch of people try to "trick the system" when iOS 15 comes out, but it won't work, because the perceptual hash won't match the faked image.

Gotta luv your #1: "… highly illegal …"
As if that has ever stopped a criminal.
BTW, that database has to exist in many places, in part or in whole. If not, why do you need to scan for it?

You can continue to downplay potential issues all you want if that makes you feel comfortable and allows you to sleep at night (or day).

Personally I would rather have it verified by an accredited non-Apple entity or three.
 
  • Like
Reactions: Pummers and Schismz
Gotta luv your #1: "… highly illegal …"
As if that has ever stopped a criminal.
BTW, that database has to exist in many places, in part or in whole. If not, why do you need to scan for it?

You can continue to downplay potential issues all you want if that makes you feel comfortable and allows you to sleep at night (or day).

Personally I would rather have it verified by an accredited non-Apple entity or three.
Even if a criminal were to make innocent photos with the same exact hash, they're still rescanned on Apple's server to rule out any false positives.
 
  • Like
Reactions: hans1972
In order to trick Apple, you would need a lot of things.

1. You would need the actual NeuralHash of a CSAM file, which would be highly illegal to obtain.
2. You would have to use software to generate a collision to match that hash.
3. Apple's server re-checks the image by decrypting it and generating a perceptual hash (different from the NeuralHash), then compares it to the known CSAM.
4. If the perceptual hash matches the CSAM image, it's passed to human review.
5. Then Apple looks at it and decides what to do next. They either verify that it's CSAM or they discard it if it somehow got through all the other checks.

That's a lot of steps to try to trick the system.

It's going to be interesting when a bunch of people try to "trick the system" when iOS 15 comes out, but it won't work, because the perceptual hash won't match the faked image.
It's obvious that you didn't even watch the video. He discusses #1 in it.
 
  • Like
Reactions: dk001
Gotta luv your #1: "… highly illegal …"
As if that has ever stopped a criminal.
BTW, that database has to exist in many places, in part or in whole. If not, why do you need to scan for it?

You can continue to downplay potential issues all you want if that makes you feel comfortable and allows you to sleep at night (or day).

Personally I would rather have it verified by an accredited non-Apple entity or three.

But the guy in the YouTube video didn't understand how the whole system works. He missed an extremely important part.
 
But the guy in the YouTube video didn't understand how the whole system works. He missed an extremely important part.

Then again, Yannic was only looking at the underlying hash functionality and how it could be misused or gotten around, something that was initially portrayed as very secure in its function.
 
Then again, Yannic was only looking at the underlying hash functionality and how it could be misused or gotten around, something that was initially portrayed as very secure in its function.
We also don’t have our hands on the final code. I’m sure there have been tweaks to it since 14.3, but I’m not positive.

However, I do know that there are safeguards in place just for this fake-match situation.
 
We also don’t have our hands on the final code. I’m sure there have been tweaks to it since 14.3, but I’m not positive.

However, I do know that there are safeguards in place just for this fake-match situation.

That is a big question mark: is this the same code?
Good chance, but it needs to be verified.

The safeguards were supposed to help validate the hash matching; they were not designed to handle deliberate misinformation or hacking.

These items alone indicate more discussion is needed before the final decision on rolling this out.
 
  • Like
Reactions: Pummers
I asked myself that question. You already can do that, but this is not "on device". This opens a new dimension, as it is triggered the moment you upload it. And there is "proof on device" as a bonus...

I am not the panic guy, but there are things you have to stop before they even start.
And: I am of no interest to anyone, and I do not own illicit material (ok, some Hollywood movies from somewhere...), but I still do not like the idea of this control mechanism.

Getting the Pegasus software isn't easy.

Also, why not just post the picture to social networks? They will report it, and it could be devastating for that person's public image.

Why go the roundabout way when you can post it to Facebook, Instagram and Twitter?
 
Then again, Yannic was only looking at the underlying hash functionality and how it could be misused or gotten around, something that was initially portrayed as very secure in its function.

So he wasn't even discussing #5, then: "Then Apple looks at it and decides what to do next. They either verify that it's CSAM or they discard it if it somehow got through all the other checks."

Apple uses two hash technologies. It seems he wasn't looking at the last one.
 
  • Like
Reactions: Jayson A
So he wasn't even discussing #5, then: "Then Apple looks at it and decides what to do next. They either verify that it's CSAM or they discard it if it somehow got through all the other checks."

Apple uses two hash technologies. It seems he wasn't looking at the last one.
Yeah, 2 separate hashes are generated. There's no way of knowing what the second hash is going to be, so it would be nearly impossible to trick the system even if you did have the entire list of CSAM NeuralHashes.

This is going to be interesting when iOS 15 drops. I bet "experts" are going to try to exploit the system to make Apple look like liars... I don't think it will work, but time will tell.
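To put a rough number on "nearly impossible", here is a toy simulation (the 96-bit width and distance threshold are assumptions for illustration, not Apple's parameters): if the second hash is independent of NeuralHash, a forged NeuralHash collision behaves like a random image against it.

```python
import random

HASH_BITS = 96      # assumed hash width, illustration only
MATCH_DISTANCE = 8  # generous "close enough" threshold (assumption)
TRIALS = 100_000

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# A forged collision against hash A looks random to an independent hash B.
target = random.getrandbits(HASH_BITS)
hits = sum(
    hamming(random.getrandbits(HASH_BITS), target) <= MATCH_DISTANCE
    for _ in range(TRIALS)
)
print(f"accidental near-matches: {hits} / {TRIALS}")  # expect 0
```

With 96 independent bits, even a distance-8 near-match has odds on the order of 10^-18 per attempt, so the forged image fails the second check essentially every time.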
 