Why shouldn't police look into people grooming children?
The Constitution. You know, probable cause stuff for a warrant. Innocent until proven guilty.

That "grooming children" line sounds like you're supporting turning parents in for taking pictures of their kids...
 
The Constitution. You know, probable cause stuff for a warrant. Innocent until proven guilty.

That "grooming children" line sounds like you're supporting turning parents in for taking pictures of their kids...
The Constitution doesn't cover your interactions with private businesses, and it especially doesn't apply when you explicitly agree, ahead of time, to terms of service and a license agreement allowing them to search the photos before uploading.
 
I think you misunderstand the point I was trying to make. The point is computer scientists have already defeated the system by creating a way of modifying images so that (1) they still look the same to the human eye and (2) they are not detected by the kind of system Apple's proposing, which would necessitate Apple loosening its matching criterion, thereby seriously elevating the number of false positives.
No, I get your point. Computer scientists haven’t defeated the system, they just created something that will flag a match for an algorithm they created (notably not Apple’s, one they created themselves that we’re to ‘assume’ is functionally similar to Apple’s). That’s like me making a lock, then using the information I used to build the lock to create a key for it. That doesn’t mean I’ve defeated my lock. It STILL works as a lock and the door can’t be opened without a key. I’ve done nothing more than create another key that unlocks the lock (which is exactly how locks are expected to work).

Any computer scientist who COULDN'T fool a system they created themselves wouldn't be a very good computer scientist. And, for the ones that did, I'm sure they're enjoying the links to their work. :)
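For anyone curious how this kind of matching works mechanically, here's a minimal sketch using a simple difference hash (dHash) and a Hamming-distance threshold. This is an illustration only: Apple's NeuralHash is a neural-network-based hash, its threshold isn't public, and every name and value below is a stand-in.

```python
# A minimal sketch of perceptual-hash matching, assuming a simple
# difference hash (dHash) as a stand-in for Apple's NeuralHash (which
# is neural-network based and differs in detail).
from PIL import Image

def dhash(image_path, hash_size=8):
    """Hash built from horizontal brightness gradients: survives
    resizing and small edits, unlike a cryptographic hash."""
    img = Image.open(image_path).convert("L").resize(
        (hash_size + 1, hash_size), Image.LANCZOS)
    pixels = list(img.getdata())
    width = hash_size + 1
    return [pixels[row * width + col] > pixels[row * width + col + 1]
            for row in range(hash_size) for col in range(hash_size)]

def hamming(a, b):
    """Number of bit positions where two hashes disagree."""
    return sum(x != y for x, y in zip(a, b))

# Matching is a distance threshold, not exact equality. Loosening the
# threshold (raising MAX_DISTANCE) catches more altered copies of known
# images, but also lets more unrelated images collide -- the
# false-positive trade-off being argued about above.
MAX_DISTANCE = 5  # hypothetical; Apple's real threshold is not public

def matches(path_a, path_b):
    return hamming(dhash(path_a), dhash(path_b)) <= MAX_DISTANCE
```

The collision papers under discussion target exactly this kind of scheme: construct a second image whose hash lands within the threshold of a target's hash without the two images looking alike.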
 
The Constitution doesn't cover your interactions with private businesses, and it especially doesn't apply when you explicitly agree, ahead of time, to terms of service and a license agreement allowing them to search the photos before uploading.
As I said in my message, once it's reported to the government, it is a problem.

But anyway, businesses doing it is supposed to make me feel okay about it? I trust businesses even less than the government.
 
As I said in my message, once it's reported to the government, it is a problem.

But anyway, businesses doing it is supposed to make me feel okay about it? I trust businesses even less than the government.
They're not reporting it to the government. They're reporting it, after human evaluation to confirm it's child porn, to a private entity.
 
Good enough? Look at the ImageNet dataset, you'll find more. Also look at desert images: the old Google AI back in 2016/2017 (not neural-hash based) incorrectly classified some desert images as porn. It turns out some of these desert images have the same neural hash as actual porn images. You can easily find the desert images; the matching porn images you'll have to come up with yourself. ;)
Um, this is what I was assuming you'd present. :) An image with a visibly single color background (i.e. very little color variance information) is EXACTLY the type of image someone would create if they're trying VERY hard to create a match where one doesn't exist normally. So, no, hardly good enough. Especially not that example of the axe. There's nothing "normal" about the mangling of the aliasing on it (just like there was nothing normal about the REALLY odd dithering of the prior examples). If these are the best examples they have, then this content matching method (admittedly not Apple's, one they created) is actually pretty solid. Especially considering they had a vested interest in creating it such that it would allow impressive false matches… and these are the best they arrived at.
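To make the "very little color variance" objection concrete, here's a toy demonstration under the same dHash stand-in assumption as the sketch earlier in the thread: a gradient-based hash of a near-uniform image collapses to all zeros regardless of the color, so wildly different flat images hash identically, which is why flat backgrounds are cheap collision fodder.

```python
# Toy demonstration (dHash stand-in, not Apple's NeuralHash): solid-color
# images have no brightness gradients, so a gradient-based hash
# degenerates to all-False bits no matter what the color is.
from PIL import Image

def flat_image_bits(color, hash_size=8):
    img = Image.new("RGB", (64, 64), color).convert("L").resize(
        (hash_size + 1, hash_size), Image.LANCZOS)
    px = list(img.getdata())
    width = hash_size + 1
    return [px[r * width + c] > px[r * width + c + 1]
            for r in range(hash_size) for c in range(hash_size)]

# A red square and a blue square: completely different images, identical hash.
assert flat_image_bits((200, 30, 30)) == flat_image_bits((30, 30, 200))
```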
Yeah, only some people sitting in dark offices looking at private data. What's on people's private devices is no one's business.
Same could be said for the FBI knocking at your door with a full squad and taking you in for questioning while your neighbors are watching. Damage is done, even if you walk free.
No, the same couldn't be said for the FBI knocking at your door while your neighbors are watching, because some people sitting in dark offices looking at private data are not doing so with your neighbors as onlookers. :) You should probably be aware that algorithms to detect financial fraud similarly run against every credit transaction anyone performs. They run the check, it comes up with a high likelihood that the transaction is valid, and there's no further outcome. This is far more like that.
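For what the fraud-check analogy looks like in code, here's a minimal sketch of a threshold-based screen; every name and weight is invented for illustration, and real systems use trained models rather than hand-written rules.

```python
# Hypothetical fraud screen: every transaction is scored, and in the
# overwhelmingly common case nothing happens at all.
def fraud_score(amount, country, home_country, txns_last_hour):
    score = 0.0
    if amount > 5000:
        score += 0.4   # unusually large purchase
    if country != home_country:
        score += 0.3   # transaction far from home
    if txns_last_hour > 10:
        score += 0.3   # burst of rapid transactions
    return score

REVIEW_THRESHOLD = 0.7  # hypothetical cut-off

def check_transaction(amount, country, home_country, txns_last_hour):
    if fraud_score(amount, country, home_country, txns_last_hour) >= REVIEW_THRESHOLD:
        return "hold for human review"
    return None  # no flag, no record of suspicion, no further outcome
```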
 
The Constitution. You know, probable cause stuff for a warrant. Innocent until proven guilty.

That "grooming children" line sounds like you're supporting turning parents in for taking pictures of their kids...
You don’t think there’s a difference between parents taking pictures of their kids and predators grooming children? I’d trust the NCMEC about this kind of stuff more than I would most.
 
This sounds 100% reasonable. The only people who are complaining are either 1) pedophiles, or 2) people who can't understand what the system does/how it works.
For anyone who keeps CP off their devices, I'd be willing to pay 100% of the legal fees if they get false matches that get reported and, as a result, get taken to court for those false matches. No low-dollar lawyer, either; I'd source the BEST and we'd ALL end up fairly wealthy.

*offer void if it's found that they actually have CP*
 
For anyone who keeps CP off their devices, I'd be willing to pay 100% of the legal fees if they get false matches that get reported and, as a result, get taken to court for those false matches. No low-dollar lawyer, either; I'd source the BEST and we'd ALL end up fairly wealthy.

*offer void if it's found that they actually have CP*
I’d happily throw some money into that pot too to defend a false positive. But, as you said, the offer is null and void if you do have child porn.

Willing to bet I never have to spend a penny based on this post.
 
They're not reporting it to the government. They're reporting it, after human evaluation to confirm it's child porn, to a private entity.
False; it eventually gets reported to the government. (Otherwise, what good would it do from your POV?)
 
You don’t think there’s a difference between parents taking pictures of their kids and predators grooming children? I’d trust the NCMEC about this kind of stuff more than I would most.
Not enough of a difference to make one illegal and the other not. I don't trust the NCMEC with anything, given the statistics.
 
I'm not worried about getting reported; I'm worried about someone getting reported falsely and illegally.
And, how exactly would that happen? There’s human review, and reviews at NCMEC. Besides, that’s what our justice system is for — to ensure only guilty people are punished for crimes actually committed.
 
And, how exactly would that happen? There’s human review, and reviews at NCMEC.
That's already been discussed numerous times.

Besides, that’s what our justice system is for — to ensure only guilty people are punished for crimes actually committed.
A false accusation can be just as life altering as if you did something wrong. Lose your job, most likely, and that would be the least of your problems even if the justice system works, which it clearly doesn't always.
 
Um, this is what I was assuming you’d present. :) An image with a visibly single color background (i.e. very little color variance information) is EXACTLY the type of image someone would create if they’re trying VERY hard to create a match where one doesn’t exist normally.
They are real images; whether you like them or not is irrelevant. Other ImageNet images are real as well and match. Did you even try matching the desert images against real porn images?
Again, they don't even have to be real. Spreading created images is enough. This has been discussed over and over, from S&P to DFRWS and CODASPY. No need to repeat it; you just don't agree with world-leading experts. It's not the end of the world.
You should probably be aware that algorithms to detect financial fraud similarly run against every credit transaction anyone performs. They run the check, it comes up with a high likelihood that the transaction is valid, and there's no further outcome. This is far more like that.
Different thing, and also not running on devices. I'm well aware of how these work; I've published a few AI-related papers on that topic.
 
An interesting article, and possibly/likely true. It is important to remember that this is an article in which someone states that Apple told them this, not a statement from Apple directly. Do you have anything from Apple that states this? Apple will sometimes dispute claims in the press that are false, but often they do not address them either.

In January 2020, Apple senior privacy officer Jane Horvath told a panel at CES in Las Vegas that the company used specialist software to automatically check iPhone images backed up to iCloud for signs of child abuse images.
 
Encrypt your emails and use services with E2EE for storage; problem solved. That won't happen for iCloud Photos now.
But as I said before, it's Apple's decision and the user's responsibility how to handle it. The security research group down the hall went from "many people using Apple" to "let's buy just enough to keep research going".
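As a concrete version of the "encrypt before you upload" advice, here's a minimal sketch using the Python cryptography package's Fernet API; the filenames are hypothetical. The key never leaves your device, so the storage provider only ever holds ciphertext it cannot scan.

```python
# Client-side encryption before upload: the provider stores only
# ciphertext, so no scan of the stored copy can see the photo.
# Losing the key means losing the data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # keep this local and backed up
f = Fernet(key)

with open("photo.jpg", "rb") as src:      # hypothetical filename
    ciphertext = f.encrypt(src.read())

with open("photo.jpg.enc", "wb") as dst:
    dst.write(ciphertext)                  # upload this file instead

# Later, on any device that holds the key:
# original = f.decrypt(ciphertext)
```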
 
You don’t think there’s a difference between parents taking pictures of their kids and predators grooming children? I’d trust the NCMEC about this kind of stuff more than I would most.
Please explain: what are the negative consequences for a bs SAR submitted by NCMEC? What group has oversight over what they do?

In 2019, NCMEC sent 8028 "Suspicious Activity Reports" to Swiss FedPol. Only 693 of those even made it to the referral stage. The Swiss don't say how many of the 693 resulted in prosecutions. Even if it was all of them, that's less than 10% of NCMEC SARs being deemed worthy of a criminal referral.

91% of what NCMEC submitted was not deemed criminal, and you trust these people's judgement? You trust your device to be compared to a database whose creators are wrong 91% of the time?

If 91% of a police officer's arrests were tossed by judges for being bs, people would be losing their minds. But NCMEC gets a pass because #JustBeingCautious #ForTheChildren #SorryNoOversightIsPossible
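The arithmetic behind those percentages, spelled out:

```python
# 2019 NCMEC reports to Swiss FedPol, per the figures above.
reports = 8028    # SARs submitted
referred = 693    # reports that reached the referral stage

referral_rate = referred / reports    # ~0.086, i.e. under 10%
print(f"{referral_rate:.1%} referred, {1 - referral_rate:.1%} not referred")
# -> 8.6% referred, 91.4% not referred
```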
 
You trust your device to be compared to a database whose creators are wrong 91% of the time?
Many people trust anything, particularly fanboys. Many people here believe this type of scan is the holy grail that helps everyone, especially the children. A former colleague (CCIE and CCDE) who teaches for Cisco and at various universities, works with several government agencies and authorities around the world, and is often tasked with securing forensic evidence in CSAM cases, can only roll his eyes when this topic comes up. This type of scan will accomplish virtually nothing. The absolute morons will continue to upload CP to their Facebook accounts and will be caught. The consequences for the rest are as described.
 
Please explain: what are the negative consequences for a bs SAR submitted by NCMEC? What group has oversight over what they do?

In 2019, NCMEC sent 8028 "Suspicious Activity Reports" to Swiss FedPol. Only 693 of those even made it to the referral stage. The Swiss don't say how many of the 693 resulted in prosecutions. Even if it was all of them, that's less than 10% of NCMEC SARs being deemed worthy of a criminal referral.

91% of what NCMEC submitted was not deemed criminal, and you trust these people's judgement? You trust your device to be compared to a database whose creators are wrong 91% of the time?

If 91% of a police officer's arrests were tossed by judges for being bs, people would be losing their minds. But NCMEC gets a pass because #JustBeingCautious #ForTheChildren #SorryNoOversightIsPossible
As stated earlier, 63% of what was referred was explicit material; the rest were suspicious ages, apparent child grooming, etc., which they felt warranted investigation. The fact that the police felt almost all of these weren't a problem is a police decision, the reasons for which we aren't privy to.
 
These two articles kind of contradict each other. However, I was really interested in this article, which actually has video around the 2-minute mark of Jane Horvath speaking, and in the video she doesn't talk about them scanning photos in the cloud, though many news sites quoted her as saying so. All of the news sites speculate that Apple is using Microsoft's system for hashes.

I'd like to see the whole panel, but ultimately it only verifies that Apple doesn't want to be scanning your stuff on their servers, preferring it be done on device as something the user can turn off, which keeps the CP from getting onto their servers in the first place.
 