In Europe, it’s customary to take family pictures of 10-year-old girls topless on the beach. In the US, this would be considered child porn and you can go to prison for 20 years if they find a picture like that on your phone.

And unless those photos have been flagged by the US government as child pornography, the family will be just fine.
 
Asking honestly, but does anyone have a good link explaining the relationship between possession of child porn and child sexual abuse behavior? Every once in a while I see news articles on these "raids" on dozens of adults possessing the porn, but they never include any information on how many possessors are predators, or whether there is any evidence of a causal link between viewing and behavior.

No legal savvy here, but I think the fact that you possess it, or even look at it, is considered a crime. Technically speaking, if you were online and clicked a link and it showed CP, you are going to jail, if the law is such that looking at the pictures is a crime in itself.
 
No legal savvy here, but I think the fact that you possess it, or even look at it, is considered a crime. Technically speaking, if you were online and clicked a link and it showed CP, you are going to jail, if the law is such that looking at the pictures is a crime in itself.
Right, I'm sure that's the law. I want to understand the relationship between the thought crime (porn) and the actual crime (child sex predation) so I can evaluate the reasonableness of the law.

It's illegal to lynch someone (obviously), but nothing prevents me from Google-searching a photo of someone who has been lynched, or prevents Hollywood from depicting it vividly in movies. I just want to understand why there is a different standard here. Presumably viewing the material must cause predation, but I've never seen the numbers.
 
This is the worst news in IT/Apple this year so far. One of my main reasons for sticking with Apple has just been eroded, though it's not as if I would go anywhere else, since the others are just as bad. Someone on Snowden's Twitter praised Linux phones, but that's a non-starter for me personally. Do they even exist?
 
“It wouldn’t be an effective scan for child abuse photos if child abusers could disable it.” I can see this argument being made by Apple.
I guess this argument didn't age well, based on newer articles posted here saying it would be turned off if iCloud Photos isn't used.
 
iOS 15:
- Apple will now scan your images, searching for evidence of foul play.

iOS 16:
- Apple will now listen to your voicemail, searching for evidence of foul play.
- Apple will now monitor your browsing history, searching for evidence of foul play.

iOS 17:
- Apple will now scan your videos, searching for evidence of foul play.
- Apple will now monitor your Apple Card, searching for evidence of foul play.

iOS 18:
- Apple will now scan your text messages, searching for evidence of foul play.

iOS 19:
- Apple will now have Siri listen 24/7 for evidence of illegal activities.
- Apple will now require that your camera be enabled and recording 24/7 for later review.
 
Let us stipulate that child porn is wrong.

It remains that a private corporation is now doing something, at scale, that would require a warrant if done by law enforcement. This is unacceptable, full stop. This is vulnerable to mission creep of unbelievably dystopian proportions.

And if Apple's competitors are also doing it, the only difference is that they will monetize it better than Apple.
I fully agree. Child porn is wrong and illegal. Searching every single person's private images without a warrant is also wrong and illegal in many countries. In all honesty, I find this terribly 1984.

Universally, people find child abuse repulsive and want to take action to prevent it, so they are less likely to fight against the idea of scanning all private images in order to catch those who abuse children. However, I assume the abusers know that cloud service providers scan images, so they most likely use other forms of private storage. Apple therefore just ends up scanning regular photos and building a massive database which can be used to match any particular photo if necessary. First you introduce the feature to fight child abusers, and after a few years the same approach is used to fight the misfits, the rebels, and the troublemakers. The round pegs in the square holes. The ones who see things differently.

All in all, I believe the practice of scanning one's private information on one's private device without a search warrant is highly illegal in many countries. In the EU, service providers can do CSAM scanning of internet communications. However, they absolutely can't scan all the pictures you have on your phone. Doing so would most likely get the whole CSAM effort banned in Europe (the law allowing scanning is in effect until 2024), since in that case invasive measures are being used which are illegal.
 
Let's see... iCloud... Apple's servers (or perhaps Google's), right?

Why in the world would Apple/Google want child porn on *their* servers?
Terrorism? Screw you, FBI. Kiddie porn? Here are the keys, take them, FBI.

I understand these are different issues technically, but that is the quick take impression.
 
As long as the "friend" isn't a minor child, I don't see the problem between consenting adults, although Dr. Phil may have some questions for both hairless parties.
But will the algorithm see the difference, and would you want a human thumbing through your photos to determine whether you and your partner just like to shave crotch?
 
In the end, the timing of this announcement is really bad for Apple, coming so soon after the NSO Pegasus spyware hack in the news. Apple, probably after consultations with its own legal team, may delay the implementation of this feature (for now) to avoid legal challenges from the ACLU and EFF in the USA and from European Union legal authorities.
 
This one:
In Europe, it’s customary to take family pictures of 10-year-old girls topless on the beach.
I have a bunch of French friends whose 9-10 year-old girls swim topless at a local neighborhood pool and are customarily photographed by their parents.

I was born and raised in Russia where pre-puberty girls were always topless at the beach and customarily photographed by their parents.

In these cultures, girls' chests are not covered until they develop breasts. Obviously, French, German, Scandinavian, etc. women love going to the beach topless where it's allowed, but that's a different story altogether.

That’s how I know.

For most Americans, it's shocking to see 10-year-old girls topless in a public place. The notion "child porn" immediately pops into one's head. Even for me it was a little shocking at first when I saw the French letting their girls go topless in public. Then I remembered my childhood and realized it was completely normal almost everywhere else besides the US.
 
So if I have a private picture of my own child splashing away in a bathtub, it will now be flagged, and some contractor working halfway around the world for pennies gets to view my child in the bathtub, without my permission, and determine whether perhaps I am some kind of child abuser? Maybe they even keep a "souvenir" of the photos somehow. Then I get to talk to a detective and be forever flagged in some database as someone accused of, or at least investigated for, one of the most horrific crimes that exists? What could possibly go wrong?

UPDATE: There is no option to avoid this scanning if you use Photos or iCloud Photos. However, upon reading more about this, it apparently works by encoding your image into a "hash string," which isn't the actual image, and then comparing that hash string against a list of KNOWN child abuse images from the internet (so unique images of your children playing in the bath aren't even an issue). Only if you have a certain number of exact matches does a person "manually" look at the images and determine whether a crime is occurring that they need to report. They claim there is a less than 1 in a TRILLION chance that a flagged account doesn't actually contain significant amounts of true child abuse imagery. I need to read more, but perhaps it isn't as creepy as it first sounded to me... but they need to be very, very transparent about what is going on.
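
Roughly, the matching logic as described would look something like this. This is a toy sketch only: every name and number here is made up, and Apple's real system reportedly uses a perceptual "NeuralHash" plus cryptographic safety vouchers rather than a plain file hash:

```python
import hashlib

# Placeholder database of hashes of already-known abuse images.
# (Hypothetical; the real list ships in an encrypted, blinded form.)
KNOWN_HASHES: set[str] = set()

# Hypothetical threshold: below this count, nothing is flagged or reported.
MATCH_THRESHOLD = 30

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. A real perceptual hash is built to
    survive resizing/recompression; SHA-256 only matches byte-identical files."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag_account(library: list[bytes]) -> bool:
    """Count photos whose hashes appear in the known-image database.
    Unique photos of your own kids can't match; only images already
    in the database count toward the threshold."""
    matches = sum(1 for img in library if image_hash(img) in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD  # only past this point does human review begin
```

The threshold is where the "1 in a trillion" claim comes from: a single accidental hash collision does nothing on its own, and an account is only flagged after many independent matches, which drives the combined false-positive probability way down.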

The real issue perhaps isn't false flags so much as how this technology could spread to scanning for other types of images (political, etc.) to flag individuals. I get that sounds very "black helicopter" of me to say, but it is Apple itself that says it doesn't build backdoors into iOS because of the chance they could be abused.
If your personal photos of your children are in the database, then you just might be a pedophile.
 
For the first time, I will simply not install the update; I'll sit back and wait. I am NOT letting ANYONE scan my private stuff just to check whether there is anything unlawful. There IS nothing unlawful here, yet we are all flagged as possible liars who have to be checked first - again and again...

I am NOT accepting being flagged as a possible liar by default.

No, Apple - simply NO!
 
Basically, no more Apple OS upgrades for me on any of my devices. It’s probably time to go back to Linux again and my next phone will surely be an Android.

Privacy!
Is Android really better on privacy? I have strong doubts.
 
Is Android really better on privacy? I have strong doubts.
Tim Cook: Android already does things like that. We can introduce it at Apple as well; we'll just make it a bit more sophisticated and sell it as an improvement. That will work.
Me: I am not on Android, because I trusted Apple - until today. But now I will search for alternatives for my own private and legal stuff. Go to hell. As of today you have lost my confidence, dear Tim.

Apple Stock just dropped - interesting...

I might even think about removing my credit cards from Apple Pay.
And those boarding passes... I might dust off my old printer and buy some cartridges and paper sheets...
 
It’s weird that I love the show “Person of Interest” and need to rewatch it, but I don’t love seeing it happen in real life.
I agree with the hashing database for child porn, but the slope is getting slippery, and if we have another Trump, I can guarantee he’d use this feature solely to yell at his competitors about their banal lives, or worse, blackmail them. Apple doesn’t seem to have morals when it comes to anything regarding online safety/security, so we will see how it goes.
 
It’s weird that I love the show “Person of Interest” and need to rewatch it, but I don’t love seeing it happen in real life.
I agree with the hashing database for child porn, but the slope is getting slippery, and if we have another Trump, I can guarantee he’d use this feature solely to yell at his competitors about their banal lives, or worse, blackmail them. Apple doesn’t seem to have morals when it comes to anything regarding online safety/security, so we will see how it goes.
Apple doesn't have morals when it comes to anything regarding online safety/security? I think this proves the opposite of that.
 
Apple doesn't have morals when it comes to anything regarding online safety/security? I think this proves the opposite of that.
Not necessarily, since while they're doing the right thing in the US (albeit in a way that can be easily abused if it doesn't go through a human review step at some point to prevent false positives), they aren't in China. Anything for the highest $, right?

 
Not necessarily, since while they're doing the right thing in the US (albeit in a way that can be easily abused if it doesn't go through a human review step at some point to prevent false positives), they aren't in China. Anything for the highest $, right?

I was just replying to your assertion, which is false, and not building a story around it. If you want to continue to qualify what you really meant, maybe go back and edit the original post.
 
I was just replying to your assertion, which is false, and not building a story around it. If you want to continue to qualify what you really meant, maybe go back and edit the original post.
Except I meant what I said, and I elaborated specifically because your rhetoric confused it in this instance. If you wanted clarification, all you had to do was ask lol, no need to snark. And it's not objectively false that Apple sells private information to the highest bidder in China while maintaining hypocrisy in America, no? Or did you think we're safe because the hash information isn't being used to pursue more mundane crimes YET? I agree with the technology, but we really need to look at what it WILL be used for with a more critical lens in regards to a surveillance state, re: China and potentially even America. Hope this clarifies!
 
Except I meant what I said, and I elaborated specifically because your rhetoric confused it in this instance. If you wanted clarification, all you had to do was ask lol, no need to snark. And it's not objectively false that Apple sells private information to the highest bidder in China while maintaining hypocrisy in America, no? Or did you think we're safe because the hash information isn't being used to pursue more mundane crimes YET? I agree with the technology, but we really need to look at what it WILL be used for with a more critical lens in regards to a surveillance state, re: China and potentially even America. Hope this clarifies!
There was no confused rhetoric or snark on this end. I was simply pointing out that this "revelation", no matter how upsetting it is to people, goes against the assertion that Apple has no morals. This "action" by Apple shows it does have morals, even if implemented in some draconian 1984 way that will manage to get many people upset.
 