Why not? Do you have inside info about Apple’s hardware & software? Even if your suspicions are true, with iCloud photo sharing disabled, NOTHING is ever sent to Apple. No person at Apple can view the images on your devices. So no person at Apple can send the authorities after you. Apple told its customers how to disable this “feature”, probably pissing off people in very high places.
Do YOU have inside info? No? So you assume.....
 
I think if you think of it as a juror, you might be hard pressed to convict someone on cloud content alone; it seems to me any lawyer could create reasonable doubt. However, once a tech company reports it, you would think it simply creates probable cause to get a search warrant. It seems likely to me that if someone were guilty, they would have other evidence discovered in their home, so if law enforcement does their job I imagine the conviction rate would be high. But if they find nothing during the search, I think that would raise red flags that something else is going on and they were falsely accused. Just my opinion. It seems to me the Apple hash thing may have made it easier to prosecute solely on the iCloud data, due to it being directly tied back to the device, which kinda suggests the government was involved in this. But that's just speculation and devolves into the conspiracy thing quickly.
This is exactly it. Anybody assuming that the cops are just going to go and arrest someone based solely on a report from Apple has no idea how law enforcement actually operates when it comes to these things. It's a trigger to dig deeper — a smoking gun, if you will — but not an automatic determination of guilt.

To be fair, the possibility of corrupt cops means there's a slim chance somebody could be caught up in a web of other problems, but the probability of being identified by Apple's CSAM detection systems is ridiculously slim in the first place. Having 30 or more known CSAM images in your iCloud Photo Library — that is, those that match the database of stuff already circulating among child predators on the dark web — makes you a pretty likely suspect to begin with.
 
I respect this take a lot. Apple assumed I'm a criminal after being a lifelong user. That alone is enough for me to never use their devices again.

You know you don’t have to show a store your bags. Unless it’s like a Costco where your membership agreement says they will go over your cart before you leave. That at least is an agreement between you and the store.

On topic now, I don’t think Apple was accusing us of being criminals; they saw a problem and tried to fix it the best way they thought they could. Apple really doesn’t want to be known as the safe haven for kiddie porn. Unfortunately, there isn’t a great solution. I also think they were trying to appease the government and put one more hole in the “Apple should put in a back door for cops” argument.

All this being said, I’m still 100% against the CSAM hash technology. My photos should be accessible by me alone. I’m only pointing out that I see where they were coming from.
 
A couple of years ago, I remember getting my bag searched when I left a retail clothing store. I have never stolen anything in my life.

Sure, they could check my bag, and indeed they found nothing, but that search alone was enough for me to NEVER return to that store.

...or to put it more plainly, the store thought that I might be a thief.

That presumption was offensive... as is the CSAM photo scanning on my device.

Good luck, Apple, if you want to treat your customers that way.
Yeah, it's a harsh policy. It's not that the store thinks you're a thief; it's that the store is engaging in highly aggressive risk management. But from the customer's POV it definitely feels like they are treating you like a thief. Perception is king. This is what is happening with Apple, IMO.

[In relation to your story about store policy (apart from places like Costco) - Sure they can ask to search your bag, but if you refuse they can't do anything except tell you to leave their store and refuse you entry next time! If it's not a store you go to often, who cares! Usually they are just young kids asking anyway, so they really don't care.]
 
I understand why this is a slippery slope but I don’t like the idea of child predators breathing a sigh of relief.
Good. Have the authorities, who are paid by your taxes to do just that, hunt these predators down and bring them to justice, without having every single one of a billion devices scanning through people's albums for abuse content.
 
What?

You are equating child abuse with being a foreigner?

Come for the pedophiles! Arrest them! But with warrants and due process, not by violating the rights of everyone in the process.
If that is what you took from my post then no amount of explaining will help you understand what it means. Or rather, I just don't feel like spending the energy trying to clear it up for you.
 
I think I have to disagree.

A parent's job is to get a kid ready for the world. By age 8 or so, a child should be trusted enough to access the internet on their own. Prior to this, there should have been conversations with the child about the ramifications of doing bad things on the internet, and how things done now can affect them years or decades down the road.

But just like a parent may tell a child not to play in certain parts of town because they are dangerous, kids will sometimes break the rules and play there anyway. Just because you allow your child to be on the internet doesn't mean that you don't want to know if they are doing things that are wrong (which they might not see as potentially harmful).
Here is what, I think, you're failing to understand. The internet is a virtual gateway to real-world and psychological harm. There are things he can see that will damage his mind. There are people out there, ready to prey on him. And as a parent, it's your job to protect him from that sort of thing. Because once he sees it, or once he is preyed on, it's too late. The damage is done. There is no fixing it. No going back.
 
iCloud is still needed. No iCloud, no hashing, no cops. If Apple wants to scan the content that’s being uploaded to their servers, then they can. I won’t ditch my iPhone over this non-issue.
While this is how this "feature" is described, I just don't want this sort of spyware on a device I paid for.
The Pegasus hack debacle has shown that iOS isn't necessarily secure.
Apple wants to prevent CSAM from being uploaded to iCloud, so they should scan in iCloud – and only there.
 
This is exactly it. Anybody assuming that the cops are just going to go and arrest someone based solely on a report from Apple has no idea how law enforcement actually operates when it comes to these things. It's a trigger to dig deeper — a smoking gun, if you will — but not an automatic determination of guilt.

To be fair, the possibility of corrupt cops means there's a slim chance somebody could be caught up in a web of other problems, but the probability of being identified by Apple's CSAM detection systems is ridiculously slim in the first place. Having 30 or more known CSAM images in your iCloud Photo Library — that is, those that match the database of stuff already circulating among child predators on the dark web — makes you a pretty likely suspect to begin with.
It is not known images, but images that look like known CSAM images. The chance of getting falsely flagged is only 1:3333 higher than winning the Powerball lottery, and that's before anyone tries to send you maliciously crafted false-positive images.
 
While this is how this "feature" is described, I just don't want this sort of spyware on a device I paid for.
The Pegasus hack debacle has shown that iOS isn't necessarily secure.
Apple wants to prevent CSAM from being uploaded to iCloud, so they should scan in iCloud – and only there.
But the "feature" has been designed with the presumption that iOS is not secure and the device could have been breached. That is why the hash matching happens in secret and the results are never known to iOS. Why would a Pegasus-style hack bother trying to extract the minuscule amount of information the CSAM check reveals, when it would have access to the actual data on the device? Hash matching is absolutely trivial when you aren't worried about privacy; a Pegasus 2 could just ignore Apple's CSAM system and implement its own hash matching with its own list of target images, far more easily than trying to piggy-back on what Apple is doing.
 
It is not known images, but images that look like known CSAM images.
Where did you read/learn that it is images that look like CSAM images? All published information from Apple said that the hashes are generated from images in the CSAM database, at least that's how I understood it. The hashing algorithm used has the intelligence to generate the same hash even if the image is altered (slightly?), but it should be the same image, not an image that looks similar. I think that's why Apple has the confidence to claim a 1 in a trillion chance of any account being wrongly flagged as containing CSAM material in iCloud Photos.

I think the key phrase here is CSAM material, not material that looks like CSAM material.
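For what it's worth, here's a rough sketch of the general idea behind a perceptual hash, i.e. a hash that still matches after minor edits like resizing or small brightness changes. This is a simple "average hash" written for illustration only; it is not Apple's NeuralHash, and the file names are hypothetical.

```python
# Illustrative only: a simple "average hash" (aHash), NOT Apple's NeuralHash.
# It shows how a perceptual hash can survive minor edits to an image, while a
# genuinely different image produces a very different hash.
from PIL import Image

def average_hash(path, hash_size=8):
    # Shrink to hash_size x hash_size greyscale, then compare each pixel
    # to the mean brightness to build a 64-bit fingerprint.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    # Number of differing bits; a small distance means "effectively the same image".
    return bin(h1 ^ h2).count("1")

# Hypothetical usage:
# original = average_hash("photo.jpg")
# resized  = average_hash("photo_resized.jpg")
# print(hamming_distance(original, resized))  # usually near 0 for the same image
```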
 
Why would a Pegasus-style hack bother trying to extract minuscule amount of information the CSAM check reveals, when they would have access to the actual real data on the device?
If someone is being targeted with a Pegasus style attack, this person must be someone important. CSAM is probably very low in the list of things to worry about. Basically every single bit of information in this person's device will be extracted and sent to a server somewhere.
 
Hey, Abazigal, now that you're an official 'contributor' to MacRumors, is it good form to make posts like this?
Unless the idea is that your 'contribution' is supposed to be passive-aggressive responses to people.... hmmm

@crymimefireworks

and just like that….you gave away your age. And your frame of view.
 
and just like that….you gave away your age. And your frame of view.
Help me out, is that supposed to be some kind of burn? If so, I don't get it.

All I was saying is that now that he's in a position where he (by association) 'represents' MacRumors staff, he should be more careful about how he responds to people, since many of us will interpret his comments as representing MacRumors.

So, when he makes a post like that, a post that effectively has no other intent than to be confrontational, it reflects on MacRumors.
 
Oh my god, how stupid is Apple's leadership? Do they really think that real child abusers will continue to use iCloud Photos after the rollout? This will just be a massive pain in the ass for us regular consumers.

It's like adding additional regulations when buying weapons legally; are you stupid? Gangsters don't buy weapons legally, they don't care about your rules.

The same will apply to CSAM detection in iCloud Photos.

I swear, sometimes people have their heads so far up their asses, they can almost see the light through their mouths...
 
Oh my god, how stupid is Apple's leadership? Do they really think that real child abusers will continue to use iCloud Photos after the rollout? This will just be a massive pain in the ass for us regular consumers.

It's like adding additional regulations when buying weapons legally; are you stupid? Gangsters don't buy weapons legally, they don't care about your rules.

The same will apply to CSAM detection in iCloud Photos.

I swear, sometimes people have their heads so far up their asses, they can almost see the light through their mouths...
I’m guessing this is the story in their heads. Debate the right or wrong of each point all you like, but this fits Apple’s MO of the last few years. If you believed all four of these, what would you do?

1. We have an upcoming photo storage product which is the best in its class (we’re super excited) for ease-of-use, reliability, security and privacy. Nobody can snoop on users’ photos.
2. Because of what it offers, it’s likely to attract a ton of pedos, and political heat about harbouring pedos.
3. We feel a responsibility not to have our world-changing products facilitate child abuse.
4. We’ve designed a clever system to deny pedos access to our service without having any practical effect on legit users’ privacy.
 
I’m guessing this is the story in their heads. Debate the right or wrong of each point all you like, but this fits Apple’s MO of the last few years. If you believed all four of these, what would you do?

1. We have an upcoming photo storage product which is the best in its class (we’re super excited) for ease-of-use, reliability, security and privacy. Nobody can snoop on users’ photos.
2. Because of what it offers, it’s likely to attract a ton of pedos, and political heat about harbouring pedos.
3. We feel a responsibility not to have our world-changing products facilitate child abuse.
4. We’ve designed a clever system to deny pedos access to our service without having any practical effect on legit users’ privacy.
Unfortunately #4 is easily gotten around with Apple’s design.
 
It is not known images, but images that look like known CSAM images. The chance of getting falsely flagged is only 1:3333 higher than winning the Powerball lottery
It's definitely known CSAM images, not those that merely look visually similar. That would most definitely be a bridge too far.

Some folks are misunderstanding the fact that the hashing algorithm allows images to still be matched if they're slightly modified (e.g. cropped or have had colour/greyscale filters applied), but the content of the image still has to be identical to a known CSAM entry. This is not analytical machine learning at work, but rather the same kind of hashing technology that's been used to validate files for decades. Change one byte of the actual content and you get a completely different hash that won't match.
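As a rough illustration of that "change one byte and the hash changes completely" behaviour, here's what it looks like with an ordinary cryptographic hash. SHA-256 is used purely as a stand-in for the file-validation style hashing mentioned above; it is not Apple's actual matching algorithm.

```python
# Illustrative only: ordinary cryptographic hashing (SHA-256), not Apple's algorithm.
# Changing a single byte of the input produces a completely different digest.
import hashlib

a = hashlib.sha256(b"the same photo bytes").hexdigest()
b = hashlib.sha256(b"the same photo byteZ").hexdigest()  # one byte changed

print(a)
print(b)
print(a == b)  # False: the two digests have essentially nothing in common
```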

It's also worth noting that it takes quite a while for images to actually end up in the CSAM database, and they have to be circulating pretty widely before that happens. The NCMEC doesn't just add every photo that gets picked up in law enforcement investigations.

and that's before anyone tries to send you maliciously crafted false-positive images.
The odds of that being a real problem are even more slim — even if somebody was very deliberately trying to "frame" you specifically, there are a lot of things that would have to happen before you'd be at any risk of being flagged — most of which are entirely under your control:

  1. The CSAM Detection only runs on photos being uploaded to iCloud Photo Library. It doesn't do anything with photos you receive by messages, email, etc. In other words, you'd have to deliberately add these photos to your Photos library before they'd be scanned at all.
  2. You'd need to add at least 30 of these "false positive" images to your photo library before it would trigger the CSAM Detection (see the toy sketch after this list).
  3. Due to the nature of hash collisions, it would be extremely difficult to craft false positive images that actually looked like legitimate CSAM. They could just as easily be photos of a tractor.
  4. Besides, if they did look like CSAM, why would anybody actually save them to their photo library in the first place?
  5. If they don't look like CSAM, then Apple's human review process is going to see a bunch of irrelevant photos and discard the report as an obvious false positive.
Even if somebody very cleverly started crafting "false positive" photos that people are likely to collect and posting them online, step 5 would stop the entire thing dead in its tracks.... and even if Apple was totally asleep at the wheel and reported the accounts as containing CSAM, law enforcement isn't going to do anything based on a bunch of innocuous photos that are very obviously not CSAM.
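To make the threshold point from item 2 concrete, here's a toy sketch of "flag only after N matches" counting logic. The 30-image threshold comes from Apple's published description, but the hash values and helper names are made up, and this deliberately ignores the cryptographic threshold secret-sharing Apple actually described.

```python
# Toy sketch: an account is only flagged once the number of photos whose hashes
# match a known list reaches a threshold (30 in Apple's published description).
# The hash strings below are hypothetical stand-ins, not real CSAM hashes.
KNOWN_HASHES = {"hash_a", "hash_b", "hash_c"}  # stand-in for the known-hash list
MATCH_THRESHOLD = 30

def count_matches(photo_hashes):
    # Count how many photos in the library match the known-hash list.
    return sum(1 for h in photo_hashes if h in KNOWN_HASHES)

def should_flag(photo_hashes):
    # Nothing is reported until the match count reaches the threshold.
    return count_matches(photo_hashes) >= MATCH_THRESHOLD

# Example: 29 matching photos are not enough; the 30th tips it over.
library = ["hash_a"] * 29 + ["hash_x"] * 1000
print(should_flag(library))               # False
print(should_flag(library + ["hash_b"]))  # True
```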
 
Just to be fair, we have at least two simple options:

a) stop using iCloud Photos, or
b) not upgrade to iOS 15

However, if we think about it a bit more, the system Apple was proposing is not very fit for purpose: for an alarm to be triggered, it needs to find at least 30 images matching CSAM hashes. What if a predator only has 25 of those? On top of that, it was not even mentioned at WWDC, so something big must have happened in Apple-land between June and August. 🤷🏻‍♂️
Why should we have to do anything when it is Apple trying to impose this on us? But let's humour this for a minute: sure, stop using iCloud Photos; that would be inconvenient in varying degrees. But not upgrading to iOS 15 isn't a viable solution, because if we allow iOS 15 to have this, then every subsequent version, 16, 17 and beyond, will also have it. So at some point you'll face the same dilemma: either upgrade to the new OS or move on from Apple.

Everyone can keep arguing about what this can and can't do... but this fight is not about what the software can and can't do; it is about telling Apple that under no circumstances will we tolerate any kind of spyware or the like, known or unknown, in our OS.
 
It surprises me that people are still missing the point: we don’t want the hash database on our devices. It’s just spyware, no matter whether it’s looking for stuff we agree with or not. It’s just a principle thing. People keep having discussions about other stuff, Apple and trust, etc., but that’s not the issue. Apple will move forward after they figure out a way to do the same thing without our devices being in the loop, and that is something we will all support.
They aren't missing the point, they are completely ignoring it because they just don't care.
 
I understand why this is a slippery slope but I don’t like the idea of child predators breathing a sigh of relief.
I'd imagine the tech savvy predators wouldn't be storing their albums of filth in the cloud anyway. So regardless of what Apple was planning, it wouldn't be an end to anything.

My understanding from what I've read is that these people tend to store & share images and videos with other like-minded individuals on websites, so the real effort needs to go into taking these sites down. But as soon as you shut one down, another 10 pop up. Just like the efforts to combat piracy. The criminals are always way ahead of authorities.

But how do you police the transmission of every photo & video on the internet, particularly on encrypted websites? You can't.
 
Help me out, is that supposed to be some kind of burn? If so, I don't get it.

All I was saying is that now that he's in a position where he (by association) 'represents' MacRumors staff, he should be more careful about how he responds to people, since many of us will interpret his comments as representing MacRumors.

So, when he makes a post like that, a post that effectively has no other intent than to be confrontational, it reflects on MacRumors.
A "contributor" to MacRumors is simply the tag given to someone who has given a minimum of $25 to the site. It in no way implies that the person represents MacRumors. I am a contributor and do not speak for MacRumors. In fact, I disagree with certain methods of operation, but I use the site and feel that I should help them pay the bills.

In the particular case you are referring to, my comment to the poster, made in good humour, was that he seemed to be responding as if he worked for Apple. Anyway, this is off topic so enough said.
 