I have to say, I find it really disconcerting that this is the issue where suddenly so many people think Apple is overstepping. A million things you let go. But Apple decides to start doing what every other major provider does, and not allowing iMessage to continue to be a safe haven for child porn...and that generates outrage?

The reaction alone tells me this is probably something Apple really does need to do.
Your points came off quite intelligently so I have no problem believing that you understand quite well that tackling CP is not the issue everyone is upset over.

Not only are consumers upset with this "update"; many at Apple are grumbling about it as well.
 
In short, Apple introduces a system which is known beforehand to be completely useless for its intended purpose. So what is Apple’s real intention?
To protect Apple.

Some authority says, "hey! You aren't doing enough to protect/prevent this material being shared!"

Apple can say that they created this thing.
 
I do wonder about this. That doesn't mean we shouldn't do anything possible to slow down / hinder / catch child predators. But if it can be turned off anyway (unlike server side scanning), how much does it truly help? Doesn't that essentially make it somewhat of an opt-out feature from the criminal's mindset?
That is exactly my point. What most here appear not to realize is that we are arguing for or against a system that is useless in the first place.
 
To protect Apple.

Some authority says, "hey! You aren't doing enough to protect/prevent this material being shared!"

Apple can say that they created this thing.

While that authority is on the phone...

"So...about this tool you have to scan users data on their phone and look for matches against a database...."
 
But that is the one point that is so clear in the white paper that I was having difficulty with you understanding…the hash method is so precise that the odds of simply taking a pic of a baby in a bath and having it match a database image are astronomically low, much less more than one matching, so it shouldn’t be an area of concern.

The odds of a false positive also being a picture of an unclothed child (or an unclothed anyone) are even more astronomical. Since it's a hash collision and not image analysis, it's just as likely to be a picture of a cat.

Meanwhile, there's a nearly 100% chance of accurately locating any iPhone user in meat-space. If the so-called privacy mavens worrying about a never-gonna-happen false positive wanted something to fret over, that'd be it.
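The "just as likely to be a picture of a cat" point can be made concrete with a toy model. Assuming a perceptual hash behaves like n independent fair coin-flip bits for unrelated images (a simplifying assumption for illustration; the real NeuralHash is not published in this detail), the chance of a random collision is easy to bound:

```python
from math import comb

# Toy model: for unrelated images, treat each of the n hash bits as an
# independent fair coin flip. Probability that a random image lands within
# Hamming distance d of a given target hash:
def collision_prob(n_bits: int, max_dist: int) -> float:
    matches = sum(comb(n_bits, d) for d in range(max_dist + 1))
    return matches / 2 ** n_bits

print(collision_prob(96, 0))   # exact 96-bit match: about 1.3e-29
print(collision_prob(96, 4))   # even allowing 4 mismatched bits stays tiny
```

Note that nothing in this model depends on what the colliding image depicts, which is exactly the poster's point: a false positive is no more likely to be a child than a cat.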
 
The odds of a false positive also being a picture of an unclothed child (or an unclothed anyone) are even more astronomical. Since it's a hash collision and not image analysis, it's just as likely to be a picture of a cat.

Meanwhile, there's a nearly 100% chance of accurately locating any iPhone user in meat-space. If the so-called privacy mavens worrying about a never-gonna-happen false positive wanted something to fret over, that'd be it.

But modified versions of an offending image are still flagged. If this were a simple 1:1 hash match, it wouldn’t be as confusing.
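The confusion is really the difference between a cryptographic hash (a true 1:1 byte match) and a perceptual hash. A minimal sketch of the contrast, using a toy 8-pixel "image" (this illustrates the general technique only, not Apple's actual NeuralHash):

```python
import hashlib

# Cryptographic hashing: flip one bit of the input and the digest is
# completely different (the avalanche effect), so a 1:1 match breaks.
data = bytes(range(256))
tweaked = bytes([data[0] ^ 1]) + data[1:]
assert hashlib.sha256(data).digest() != hashlib.sha256(tweaked).digest()

# Perceptual hashing (toy average hash): threshold each pixel against the
# image mean, so small global edits leave the hash unchanged.
def average_hash(pixels):
    avg = sum(pixels) / len(pixels)
    return tuple(int(p > avg) for p in pixels)

image = (10, 200, 30, 220, 40, 210, 20, 230)   # toy grayscale "image"
brighter = tuple(p + 15 for p in image)        # mild brightness edit
assert average_hash(image) == average_hash(brighter)  # still a match
```

A perceptual hash is designed to survive edits like this; a cryptographic hash is designed not to.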
 
While that authority is on the phone...

"So...about this tool you have to scan users data on their phone and look for matches against a database...."
Oh yes, I can't argue against the possibility of feature creep.

I'm not particularly in love with the idea of this battery drain running on the phone. This sort of thing is why devices get slower every year. I'd rather it stayed in the cloud.
 
I have to say, I find it really disconcerting that this is the issue where suddenly so many people think Apple is overstepping. A million things you let go. But Apple decides to start doing what every other major provider does, and not allowing iMessage to continue to be a safe haven for child porn...and that generates outrage?
Well, for me it's them using my phone for their search. I already treat anything public as untrusted, but my phone wasn't public. That's the difference. I'm not worried about getting caught, I rarely take photos and almost all of those are machines and serial numbers -- it's just the idea of *them* using the phone for their own purposes and putting in a back door that will make it easier to expand what they're looking for, all the while saying that user privacy is important to them!
 
Well, for me it's them using my phone for their search. I already treat anything public as untrusted, but my phone wasn't public. That's the difference. I'm not worried about getting caught, I rarely take photos and almost all of those are machines and serial numbers -- it's just the idea of *them* using the phone for their own purposes and putting in a back door that will make it easier to expand what they're looking for, all the while saying that user privacy is important to them!

Correct.
This sounds more like their phone than mine now -- maybe it should be free if this is the way forward?
hah
 
But modified versions of an offending image are still flagged. If this were a simple 1:1 hash match, it wouldn’t be as confusing.

FFS…🤦‍♂️

Some things, people either get or they don’t…you can’t make a brain elastic enough to understand that even a modified version of the offending image could still retain parameters that a trained AI could statistically match against the parameters of the original image’s micro-areas, that a completely unrelated image wouldn’t (in 99.9999999999% of cases), and that there’s no contradiction in this.

Doubt can be a powerful tool to learn more or can make you look dumb if you try to run before you can walk…“clueless doubting” is a pandemic nowadays (literally) on social media…
 
Well I don’t think that’s fair. I also avoid Facebook and Twitter over other privacy concerns.
It’s very fair…Apple already tracks you (and your photos) 50 other ways and no one has created an uproar like this with those features.

Please tell me how this is different from (or worse than) Apple tracking you for traffic info? Reading your photos so it can identify a plant or a dog breed? Analyzing your routing data in Maps to improve directions?

There are on device and off device privacy levels added for those items that aren’t even close to the level and steps they have added for this particular process, yet we just accept them as great enhancements to make our lives easier.
 
All three FAQ questions could/should actually be answered with:

"Practically speaking yes, and if we were forced to do so by a government entity you wouldn't know."

This is the problem.

That's the problem with Google, Microsoft, Amazon, Facebook, Oracle, all of whom are doing the same. But for some reason the freak-out is over Apple scanning for child porn.
 
FFS…🤦‍♂️

Some things, people either get or they don’t…you can’t make a brain elastic enough to understand that even a modified version of the offending image could still retain parameters that a trained AI could statistically match against the parameters of the original image’s micro-areas, that a completely unrelated image wouldn’t (in 99.9999999999% of cases), and that there’s no contradiction in this.

Doubt can be a powerful tool to learn more or can make you look dumb if you try to run before you can walk…“clueless doubting” is a pandemic nowadays (literally) on social media…

Again, Apple must clearly be overstating the modifications part. I can turn a bathtub picture into a fiery hell scene. Would that get flagged? Are we only talking about small crops and small adjustments? If I turn the subject into the Hulk and have him fight Iron Man in the bath, would that still get flagged? How much modification are we talking about here? THAT is what I don’t know.
 
That's the problem with Google, Microsoft, Amazon, Facebook, Oracle, all of whom are doing the same. But for some reason the freak-out is over Apple scanning for child porn.

1. The problem is them doing scanning on our own devices.

2. Apple has gone out of their way to brand their approaches as "more private" than the other big Corps. It's not unreasonable for people to be upset when finding out it was mostly a facade.
 
It’s not illegal to have images of illegal activity - it is illegal to possess child pornography.
It's legal to have say pics/vids of yourself, say a frat boy, sexually assaulting women? :rolleyes: 🤔

Or take dressingroom/upskirt pics? Last I checked that was still all illegal and photos are evidence of your crime (or someone else's)

It's a moral choice by Apple here so that nobody can say you support the pedos; no one can oppose them doing this in principle. A very buzzword-friendly, hot-button issue to finally go after, 10 years after iCloud was launched. Why suddenly now, 10 years later?

The old "won't someone please think of the children" argument.
 
Again, Apple must clearly be overstating the modifications part. I can turn a bathtub picture into a fiery hell scene. Would that get flagged? Are we only talking about small crops and small adjustments? If I turn the subject into the Hulk and have him fight Iron Man in the bath, would that still get flagged? How much modification are we talking about here? THAT is what I don’t know.

Why would a pedo modify the pics from his collection to that extent, turning the content into something completely different?
Of course they’re talking about the kind of modification that is made purely to throw off the hashing comparison system.
 
It's legal to have say pics/vids of yourself, say a frat boy, sexually assaulting women? :rolleyes: 🤔

Or take dressingroom/upskirt pics? Last I checked that was still all illegal and photos are evidence of your crime (or someone else's)

I'm not sure what you're advocating or not here.

I'm obviously not ok with any of those examples.

But - I'm also VERY much not ok with "scanning for evidence of it everywhere, on everyone's devices, at all times"

That is a dystopian police state.
No thank you.

The ends don't justify the means on going about dealing with crime in that way.
 
Why would a pedo modify the pics from his collection to that extent, turning the content into something completely different?
Of course they’re talking about the kind of modification that is made purely to throw off the hashing comparison system.

So you are saying the modifications I explained won’t cause a collision?
 
So you are saying the modifications I explained won’t cause a collision?

Of course not?
You think the big brains behind this multi-million dollar system want to catch a picture of the Hulk just because originally that file was kiddie p0rn? They want to catch cropped, rotated, color shifted, blurred, black and white, etc. versions of the offending pic.
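That behavior, surviving crops, rotations, and color shifts but not a total content change, is the defining property of perceptual hashing. A sketch with a toy average hash and Hamming distance (illustrative only; the real system's hash function and thresholds are not public):

```python
def average_hash(pixels):
    # Threshold each pixel against the image mean: a crude perceptual hash.
    avg = sum(pixels) / len(pixels)
    return [int(p > avg) for p in pixels]

def hamming(h1, h2):
    # Count differing bits: a small count means "probably the same picture".
    return sum(a != b for a, b in zip(h1, h2))

original = [10, 200, 30, 220, 40, 210, 20, 230]
darkened = [int(p * 0.9) for p in original]        # mild edit: darken 10%
unrelated = [200, 10, 220, 30, 210, 40, 230, 20]   # different content

print(hamming(average_hash(original), average_hash(darkened)))   # 0: still matches
print(hamming(average_hash(original), average_hash(unrelated)))  # 8: every bit differs
```

The mild edit leaves the relative brightness pattern intact, so the hash is unchanged; replacing the content entirely flips every bit, so it falls outside any reasonable match threshold.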
 
Of course not?
You think the big brains behind this multi-million dollar system want to catch a picture of the Hulk just because originally that file was kiddie p0rn? They want to catch cropped, rotated, color shifted, blurred, black and white, etc. versions of the offending pic.

Well it was just an example. But that does clear things up if it’s only edits that are that minor. Thanks!
 
In most countries, including the United States, simply possessing these images is a crime and Apple is obligated to report any instances we learn of to the appropriate authorities.

Apple will refuse any such demands

So in the first case they say they are obligated to follow the law. In the second they say they will refuse any demand to search for any other kind of images. But when that request becomes a law, they will simply switch to the "Apple is obligated to report" line.

This is a pandora's box they won't be able to control.
 
Well it was just an example. But that does clear things up if it’s only edits that are that minor. Thanks!
Maybe it’s just me, but I’m starting to get creeped out that your constant examples are from the pedo point of view….wondering if the illegal image will still get matched if it is altered in one way or another.

Maybe think of better examples to give as I’m sure you are just trying to understand the tech…but seriously…stop using that kind of example.
 