The notion that any trillion-dollar tech company isn’t already owned by the IC is laughable at this point. This kind of thing is just marketing to cover how they use what they already know in their self-serving wars.
 
We don’t. That’s why I said “probably.” But I’d like somebody who knows stats to weigh in and explain how Apple can claim one in a trillion with any confidence.
You seem to have suggested that it has never occurred, so how do we know the odds? My point is: how would we even know if it had? It could be flagging false positives all the time without them ever being reported.
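For what it’s worth, a number that small usually comes from requiring multiple independent matches before an account is flagged, not from any single hash comparison being that reliable. Here’s a back-of-the-envelope sketch with made-up numbers (my own illustration, not Apple’s published derivation; it also assumes each photo is an independent trial, which real photo libraries aren’t):

```python
from math import exp, lgamma, log

def binom_tail(p: float, n: int, t: int) -> float:
    """P(at least t false matches among n photos): the Binomial(n, p)
    upper tail, computed in log space so the huge binomial
    coefficients never overflow."""
    def log_pmf(k: int) -> float:
        return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
                + k * log(p) + (n - k) * log(1 - p))
    return sum(exp(log_pmf(k)) for k in range(t, n + 1))

# Hypothetical: 1-in-a-million per-photo false match rate, 10,000 photos,
# account flagged only after 30 matches.
print(binom_tail(1e-6, 10_000, 30))  # ~1e-91, far below one in a trillion
```

The point is just that a match threshold makes the compound odds collapse far faster than intuition suggests; whether the per-photo rate estimate itself is trustworthy is a separate question.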
 
No. It doesn't work this way. It's scanning for known child sexual abuse images using hashes to identify those photos. It's not going to flag your personal photo.
Did you actually read the article lol?

Johns Hopkins cryptography researcher Matthew Green said that CSAM scanning is a "really bad idea" because in the future, it could expand to scanning end-to-end encrypted photos rather than just content that's uploaded to iCloud. For children, Apple is implementing a separate scanning feature that looks for sexually explicit content directly in iMessages, which are end-to-end encrypted.

Green also raised concerns over the hashes that Apple plans to use because there could potentially be "collisions," where someone sends a harmless file that shares a hash with CSAM and could result in a false flag.
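To make “collisions” concrete: a perceptual hash deliberately maps many similar-looking inputs to the same fingerprint, so its effective space is far smaller than a cryptographic hash’s. Here’s a toy illustration of what a shrunken hash space implies, using truncated SHA-256 as a stand-in (this is not Apple’s NeuralHash, just the general principle):

```python
import hashlib
from itertools import count

def tiny_hash(data: bytes) -> str:
    """A deliberately weak 16-bit fingerprint: the first 4 hex digits
    of SHA-256, standing in for a small perceptual-hash space."""
    return hashlib.sha256(data).hexdigest()[:4]

target = tiny_hash(b"stand-in for a known flagged image")

# Brute-force a harmless input that lands on the same fingerprint.
for i in count():
    harmless = f"innocent file #{i}".encode()
    if tiny_hash(harmless) == target:
        print(f"collision after {i + 1:,} tries: {harmless!r}")
        break
```

A real perceptual hash has many more bits, but Green’s point stands: a harmless file sharing a hash with CSAM is a matter of probability and adversarial effort, not an impossibility.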
 
Green also raised concerns over the hashes that Apple plans to use because there could potentially be "collisions," where someone sends a harmless file that shares a hash with CSAM and could result in a false flag.

This is the real issue, even before privacy.


If you're not doing anything wrong, then you have nothing to worry about.

So not true. If you have never been falsely accused, you have no idea the horror of being falsely accused of something as sinister as child pornography or sexual abuse. Pray it never happens to you.
 
The notion that any trillion-dollar tech company isn’t already owned by the IC is laughable at this point. This kind of thing is just marketing to cover how they use what they already know in their self-serving wars.


Worth repeating. Re-read this.
 
You seem to have suggested that it has never occurred, so how do we know the odds? My point is: how would we even know if it had? It could be flagging false positives all the time without them ever being reported.
It’s from the Apple technical document I linked.
 
Well, tell that to people living under oppressive governments. “Wrong” can be very subjective. Speaking ill of your government can get you arrested, e.g. that Olympic athlete from Belarus.


It’s not limited to governments, though. Anyone with an agenda, a contact in the government to obtain your search history, and some funding can crowdsource an enraged mob these days and do more harm than a malicious government ever could.

The masses can hold some alarmingly incomplete opinions depending on how a question is presented to them. Go ask 100 random people slight variations of a simple question about what punishment should await people who spread “COVID vaccine lies”, and you’ll see just how swiftly we can rationalize loading people onto cattle cars again.
 
In Europe, it’s customary to take family pictures of 10-year-old girls topless on the beach. In the US, this would be considered child porn and you can go to prison for 20 years if they find a picture like that on your phone.
I’ve seen that occasionally with European visitors in the US. Since they appear to be looking for specific known images, this may not be an issue. Still, I don’t trust those building the hash lists not to include photos that are not cases of abuse, or that someone won’t do the equivalent of swatting by maliciously finding a way to dump photos onto your device. If the bad guys know about this, they won’t use it anyway. There has got to be a better way that doesn’t invade privacy.
 
I’ve seen that occasionally with European visitors in the US. Since they appear to be looking for specific known images, this may not be an issue. Still, I don’t trust those building the hash lists not to include photos that are not cases of abuse, or that someone won’t do the equivalent of swatting by maliciously finding a way to dump photos onto your device. If the bad guys know about this, they won’t use it anyway. There has got to be a better way that doesn’t invade privacy.


They are looking for specific images (or rather, their hashes).

Child porn creeps pass around collections of existing material that the “authorities” have already documented and fingerprinted in their databases.

The system would likely NOT notice new images.

The system is looking for hashes matching those of known child porn collections from legitimate “old-fashioned police work” busts (e.g., the Genius Bar worker who calls the cops after stumbling across very obvious CP on a creepy old guy’s computer).


Vacationing Europeans should not worry, but should obviously have enough self-awareness not to take similar pictures in the USA, or face some form of public confrontation.




Note that in the above example, it’s not a stretch to replace the search for the signatures of known child exploitation images with a search for hashes of specific pictures of, say, a compromised Hunter Biden, and have all of those suddenly disappear... Next, extrapolate this scenario to searching for the signatures of pictures with digital currency wallets encoded in them, and things get really fun!





Exciting times, huh?
 
This is the real issue, even before privacy.




So not true. If you have never been falsely accused, you have no idea the horror of being falsely accused of something as sinister as child pornography or sexual abuse. Pray it never happens to you.
They are looking for multiple images, which would make it less likely. However, I’m still not comfortable with this. It could be weaponized, the database could include photos that fall within the societal norms of some places (Europe), it could lead to a slippery slope of including more types of images, etc.
 
I'm not sure what to make of this. It makes me very uneasy.

Why?

I have a four-year-old boy and very often he will spend much of his time without clothes on around the house as small children do (we live in a very warm part of the world in the summer). He often chooses not to wear swim shorts in our pool and is quite happy in his birthday suit eating an ice-lolly at his nanny's home.

These photos are our private family photos. They are for my family only, and as somebody who does not publish a single photo of my child on social media, what does this mean for me, my wife, and my son's grandparents?

Some so-called "agent" in another part of the world gets a trigger, is able to view these photos, and can possibly put a mark on my family. (We all use iPhones.)

There is something incredibly wrong with this. I'm not sure where to start with it all.
 
No. It doesn't work this way. It's scanning for known child sexual abuse images using hashes to identify those photos. It's not going to flag your personal photo.
Can you quote where the technical papers say this?

My understanding is it's looking for "nude photos". Where are the limits? This isn't simple hash searching. Plus, hashes can be easily manipulated.
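On the "simple hash searching" point: an exact cryptographic hash couldn't work for this anyway, because changing a single byte of a file changes the entire digest, which is why systems like this rely on fuzzier perceptual hashes instead. A quick stdlib demo of that brittleness (plain SHA-256, nothing Apple-specific):

```python
import hashlib

original = b"pretend these are the bytes of a photo"
tweaked  = b"pretend these are the bytes of a photo!"  # one byte appended

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(tweaked).hexdigest())  # a completely different digest
```

The robustness-to-tweaks that a perceptual hash adds is exactly what reopens the collision and manipulation worries raised earlier in the thread.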
 
Don’t understand the big deal.

This isn’t people at Apple scanning people’s photos, it’s on-board AI that’s looking for matches with an existing database.

Your photo library is already being scanned for faces (People feature) and other points of interest.
False positives are the issue.

Imagine that, like me, you have naked photos of your child on your phone. Pretty much every parent I know has this. Kids get naked all the time: in the pool, doing something silly, in the bath, running around the house... it's hard to keep clothes on my boy. It's just the way children are.

You get tagged because there is plenty of "flesh" detected on your phone.

The next thing you know (in the US), armed agents are knocking down your door, guns to your face, while your entire community watches on.

After a short investigation that probably destroys you emotionally, you return to your family (probably under the eye of social services as a "precaution"), with your entire neighbourhood still watching you.

"What did he do?"
"Is he a paedo?"
"Does he abuse his child?"
"Oh I heard such and such and such....."

Rumours drift and gossip spreads. (People are generally ******s)

Your boss hears of this and the company feels it's "better for all" you leave your job.

Jobless, the entire neighbourhood spreading gossip, social services monitoring your every move (the school knows now), your life in ruins...

But, at the core of it; You did nothing wrong.

You just took private photos of your beloved child and you ended up using an iPhone to do it.

Yup. No big deal.
 
Asking honestly: does anyone have a good source explaining the connection between possession of child porn and child sexual abuse behavior? Every once in a while I see news articles about these "raids" on dozens of adults possessing the porn, but they never include any information on how many possessors are predators, or whether there is any evidence of a causal link between viewing and behavior.
 
No, that's not how it works.

Your photo will be reduced to a single numerical fingerprint. That fingerprint will be compared to a list of fingerprints for known images, without any context about your image. This fingerprint isn't a picture of your child bathing, and it's definitely not an image or understanding of genitalia. It's just a number, like 8BA62546-1258-4E90-9096-48EE7365ECAE. Since your photo is not on the list, nothing will happen.

On the other hand (and of course you wouldn't do this, I'm just trying to explain the mechanics here), if you sold that image online to a lot of people and it became a well known image of child pornography, the FBI would eventually add the image to their database. Apple would end up building a fingerprint of the FBI's copy of that image. If that fingerprint still matched your image, it would be flagged. Apple/LEO would be able to look at their copy of the image matching that fingerprint because they acquired the image through another mechanism. In this case, though, some other mechanism got the photo from your computer or phone into the cloud.
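For anyone who wants that flow spelled out, here is a rough sketch (my own illustration: SHA-256 stands in for the perceptual fingerprint, and a plain set lookup stands in for Apple's threshold-based private set intersection):

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint. The real system derives a perceptual hash that
    survives resizing and recompression; SHA-256 just keeps this runnable."""
    return hashlib.sha256(image_bytes).hexdigest()

# Built from fingerprints of already-documented images. The matcher only
# ever sees opaque numbers like these, never the pictures behind them.
known_fingerprints = {fingerprint(b"stand-in for a known flagged image")}

def matches_known_image(image_bytes: bytes) -> bool:
    """Compare numbers to numbers; nobody inspects the photo at this step."""
    return fingerprint(image_bytes) in known_fingerprints

print(matches_known_image(b"a private family photo"))  # False: not on the list
```

Since a private photo was never in the source database, its fingerprint simply isn't on the list to match against.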
Not the FBI: a non-profit advocacy organization outside of any electoral oversight.
 