Again, correct me if I'm wrong (but only with facts, if possible), but aren't the images they are scanning for known CSAM images? So it's not scanning just any photos; it's matching photos against known photos. A little bit of a distinction there, I think.

FROM APPLE:
the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.

--------
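To make the quoted mechanism a bit more concrete, here is a very rough sketch of what "matching against a database of known hashes" looks like in general. This is not Apple's implementation: the real system uses NeuralHash and private set intersection, so the device itself never learns the match result the way this toy example does, and the names and the SHA-256 stand-in hash here are purely illustrative.

import Foundation
import CryptoKit

// Hypothetical, highly simplified sketch. The real system uses a perceptual
// hash (NeuralHash) and private set intersection; neither is reproduced here,
// and a plain SHA-256 digest stands in for the image hash.
let knownHashes: Set<String> = []   // placeholder for the blinded on-device database

// Compute a stand-in hash for a photo that is about to be uploaded to iCloud Photos.
func hashForPhoto(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// A stand-in for the "safety voucher": the match result plus data about the image.
struct SafetyVoucher {
    let photoID: UUID
    let matchedKnownHash: Bool
}

func voucher(for imageData: Data, photoID: UUID) -> SafetyVoucher {
    // In the real design this result is encoded cryptographically so that
    // neither the device nor Apple can read it below the account threshold.
    SafetyVoucher(photoID: photoID,
                  matchedKnownHash: knownHashes.contains(hashForPhoto(imageData)))
}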

Again, I'm not arguing for or against this system. In fact, I'm more against it than for it. And this was always the outcome: they were always going to postpone it after the initial reaction and concerns from privacy groups. But it will be back in some form because... "Online child exploitation is prevalent on the dark web – with over 25 million images of child abuse being investigated by NCMEC annually."

This is why Apple and the other tech giants won't give up on this.
So I have no facts for you specifically, however I would be suspicious of the number of CSAM images on the dark web. What exactly defines CSAM? In a broad sense it could be any image of a naked person under 18yo, viewed by someone 18 or older. I’m dubious about viewing these as a crime, but hey, that’s the law. So who are the primary producers of these photos? Most are probably kids themselves, innocents, or doting parents, again innocents. Now obviously there are a few really despicable people in the world who create harder core content, including parents. Those people should be locked up. But what Apple is doing won’t do anything about the worst producers of CSAM. It’s just a lot of technical jargon to create a backdoor using CSAM as the excuse. No children will be saved, but there will be the backdoor into iOS that the NSA has been craving for years.
 
Sell your Apple stock if you have any.

Counter to this one single point - if Apple's going to keep making money anyway, at least you can profit off of their stock price while not giving them a dime elsewhere. I have a friend who hates Facebook, hates that his wife is on Facebook, but bought Facebook stock a few years ago and loves the fact that he's made a killing off of them while they're not actually making anything off of him.

;)
 
I understand why this is a slippery slope but I don’t like the idea of child predators breathing a sigh of relief.
They were relieved as soon as this became known. All they had to do was opt out of iCloud. This was another example of a 'hammer' seeing EVERYTHING as a 'nail'. This was very poorly thought out. Astonishingly poorly thought out.
 
So I have no facts for you specifically, however I would be suspicious of the number of CSAM images on the dark web. What exactly defines CSAM? In a broad sense it could be any image of a naked person under 18yo, viewed by someone 18 or older. I’m dubious about viewing these as a crime, but hey, that’s the law. So who are the primary producers of these photos? Most are probably kids themselves, innocents, or doting parents, again innocents. Now obviously there are a few really despicable people in the world who create harder core content, including parents. Those people should be locked up. But what Apple is doing won’t do anything about the worst producers of CSAM. It’s just a lot of technical jargon to create a backdoor using CSAM as the excuse. No children will be saved, but there will be the backdoor into iOS that the NSA has been craving for years.
So you believe Apple is only doing this to create a backdoor!?!?
 
They were relieved as soon as this became known. All they had to do was opt out of iCloud. This was another example of a 'hammer' seeing EVERYTHING as a 'nail'. This was very poorly thought out. Astonishingly poorly thought out.

This. What Apple did, albeit unintentionally (hopefully anyway), was bring this whole issue into the spotlight, causing CSAM consumers/distributors everywhere to realize how much this goes on with other services, and how to easily avoid it on Apple.
 
I wonder how many additional children will be victimized from now until then? Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding, ignorant politicians and self-centered, selfish advocacy groups. It’s unbelievable!
Right. Because I don't put a camera on my front door and allow the police to have access to it, I'm responsible for any neighborhood break-ins. 1. Your 'argument' reeks of the hysteria you imply of others and is a good example of the 'slippery slope' type of thinking so many are wary of... and 2. I suggest some readings in logic, mathematical or philosophical. Either might lead to an improvement in your reasoning abilities.
 
For me, the damage has already been done... I thought Apple was a champion for The User, and because of that I was happy to overlook the walled-garden and in some cases the lagging behind other companies' products in terms of innovation or feature-set. For any given tech I'm interested in, for the last few years I have either only gotten the Apple version or waited for Apple to launch something, and haven't even really looked at reviews of anything else.

Now Apple has destroyed the illusion that they are working to put me in control and let me safely use my own devices. I guess I should have already known, but before this it was pretty easy to forget.

Funny/difficult to now tell myself that I don't actually need the iPhone 13... I don't need the new Apple Watch... There are other AR/VR things I can get into now instead of waiting... I can look for HomeKit alternatives instead of sinking more dollars into that ecosystem.

Sigh.
Why not go a step further in your reasoning (which I applaud, by the way)? You don't really need any smart device. You don't really need a smart watch, either.

When you get right down to it, we don't need any of that stuff at all.
 
Really? Are you sure? Correct me if I'm wrong though, but with the actual facts, not what you think.

Once 30 matches via hashes are made on your iPhone, nothing will happen unless you then upload those photos to iCloud. Once uploaded, those images will again be checked via hashes, and if 30 or more matches are present, then a human will check the photos to see if they match up.

I'm not stating if this is right or wrong, I just think the details are important and we should be getting them right.
You are adding details I glossed over, but are essentially correct. After a 30 hash match on your iPhone, if iCloud photo sharing is ON, a notification & an image from that iPhone will be sent to Apple for a person to review, and notify authorities if that person reviewing the photo doesn’t like what they see. On the other hand if iCloud photo sharing is disabled, nothing is sent to Apple to notice or review.
 
Great news and a small victory for the "screeching voices of the minority"!

My fear is they just want the iPhone 13 launch to happen without any bad press or having to launch it with 14.X.
Thank you for re-airing her atrocious quote. It showed so much of what she might think of those who disagree not with the problem but with the method of solving it. Not the type of 'absolutist' I would want anywhere near any of my constitutional rights.
 
So you believe Apple is only doing this to create a backdoor!?!?
No, I believe Apple is being forced to create a backdoor. CSAM is just the excuse. On-device scanning can be expanded to cover literally anything. In the USA, re: terrorist threats, pictures of your massive gun stockpiles? Muslim countries, photos of women not in burkas? If it can scan images, text is a breeze. Apple has been a champion of privacy. So why are naked images of children suddenly so important to eradicate now? It’s important to listen to what Apple is saying. But even more important is what Apple is NOT saying... If Apple was totally on board with this, we’d have known years ago. It wouldn’t be a shocking 180° turnaround from their core value of privacy, which they heavily advertise worldwide. Once that hash scanning has been installed into iOS, the NSA can get anything on any iPhone anywhere. And the NSA doesn’t talk about all the once-illegal, still-unconstitutional things it does.
 
You are adding details I glossed over, but are essentially correct. After a 30 hash match on your iPhone, if iCloud photo sharing is ON, a notification & an image from that iPhone will be sent to Apple for a person to review, and notify authorities if that person reviewing the photo doesn’t like what they see. On the other hand if iCloud photo sharing is disabled, nothing is sent to Apple to notice or review.
Sorry. There was no glossing over. Your info was factually wrong.
And I would even ask about the bolded section. I believe you may be incorrect about the procedure there as well in terms of the photos being sent from your iPhone to Apple.

FROM APPLE:
Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.

Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC.

------------

It does not mention it accesses the photos on your phone. Please provide literature where that is clearly stated, otherwise you are making it sound like Apple is directly accessing your phone when that is seemingly not the case.
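For what it's worth, the gating described in that quoted passage can be pictured roughly as follows. To be clear, this is only a sketch of the logic, under the assumption that a server holds the uploaded vouchers: the actual mechanism is threshold secret sharing, which enforces the "cannot be interpreted below the threshold" property cryptographically rather than with a simple count like this, and all the names here are made up.

import Foundation

// Illustrative only: the names, the plain counter, and the hard-coded threshold
// as written here are assumptions for the sketch, not Apple's implementation.
let matchThreshold = 30

struct ReceivedVoucher {
    let accountID: String
    let matchedKnownHash: Bool
    let encryptedPayload: Data
}

// Below the threshold nothing is reviewable; at or above it, only the matching
// vouchers for that account become eligible for human review.
func vouchersEligibleForReview(_ vouchers: [ReceivedVoucher]) -> [ReceivedVoucher] {
    let matches = vouchers.filter { $0.matchedKnownHash }
    return matches.count >= matchThreshold ? matches : []
}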
 
Again, correct me if I'm wrong (but only with facts, if possible), but aren't the images they are scanning for known CSAM images? So it's not scanning just any photos; it's matching photos against known photos. A little bit of a distinction there, I think.

FROM APPLE:
the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.

--------

Again, I'm not arguing for or against this system. In fact, I'm more against it than for it. And this was always the outcome: they were always going to postpone it after the initial reaction and concerns from privacy groups. But it will be back in some form because... "Online child exploitation is prevalent on the dark web – with over 25 million images of child abuse being investigated by NCMEC annually."

This is why Apple and the other tech giants won't give up on this.
No, they are scanning for manipulated images of known CSAM; it's scanning for pictures that "seem to be" those CSAM pictures. Which means your family photos can be false positives. That's why they have the 30-picture threshold: the more pictures you have in iCloud, the more likely you'll end up getting falsely flagged.

Once you're flagged and your family photos are getting reviewed, pray it's something that an American deems culturally appropriate in their country, or else...
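On the "manipulated images" point: perceptual hashes are built so that visually similar images map to the same or nearby hash values, which is what makes matching robust to cropping or re-encoding, and also what makes the occasional false positive possible. Below is a generic illustration of near-match comparison; this is not NeuralHash, and the 64-bit values and distance threshold are made up for the example.

import Foundation

// Hamming distance between two 64-bit perceptual-style hashes.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

// Hypothetical hashes of a user photo and a database entry, differing by one bit.
let photoHash: UInt64    = 0xA5F3_19C2_77D0_4E8B
let databaseHash: UInt64 = 0xA5F3_19C2_77D0_4E8F

// Treat hashes within a small distance as a match (the threshold is illustrative).
let isNearMatch = hammingDistance(photoHash, databaseHash) <= 4
print(isNearMatch)   // true: the hashes differ in only one bit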
 
No, they are scanning for manipulated images of known CSAM; it's scanning for pictures that "seem to be" those CSAM pictures. Which means your family photos can be false positives. That's why they have the 30-picture threshold: the more pictures you have in iCloud, the more likely you'll end up getting falsely flagged.
Yeah, that's what I said: known photos. Plus, that's what is written in the text I quoted from Apple.

From Apple:
The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.

But yes, I think it's very complicated.
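As a rough illustration of why a high match threshold pushes the per-account error rate so low (using made-up numbers, not Apple's published figures): if each photo independently had a one-in-a-million chance of a false match, a 10,000-photo library would need 30 such coincidences before anything becomes reviewable, and the odds of that are vanishingly small.

import Foundation

// Back-of-the-envelope only: the per-image rate and library size are assumptions
// chosen to show the shape of the argument, not Apple's actual numbers.
let perImageFalseMatchRate = 1e-6
let photosInLibrary = 10_000.0
let threshold = 30.0

// Poisson approximation: with expected false matches λ = n·p,
// P(at least k false matches) ≈ λ^k · e^(−λ) / k!  for small λ.
let lambda = photosInLibrary * perImageFalseMatchRate             // 0.01 here
let log10P = (threshold * log(lambda) - lambda - lgamma(threshold + 1)) / log(10.0)
print("P(30+ false matches) ≈ 10^\(log10P)")                      // about 10^-92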
 
Once you're flagged and your family photos are getting reviewed, pray it's something that an American deems culturally appropriate in their country, or else...

They're not supposed to be making judgment calls once it hits human review - they're supposed to be confirming whether or not it's a true match to the CSAM hash it was linked to... If it's not, it's supposed to be flagged as a false positive and not count against you.
 
It’s all fun and games when they’re coming for the pedophiles, because I’m not a pedophile.

It’s all fun and games when they’re coming for the atheists, because I’m not an atheist.

It’s all fun and games when they’re coming for the foreigners, because I’m not a foreigner.

It’s no longer fun and games, because I said something they didn’t like, and now they’re coming for me.
 
And it's not true for me.
The argument started with the claim that people will not buy the new phone because of this. (Assuming some percentage of people will not buy the phone over this; not necessarily 100%.)
The post I quoted said that is not true. (It argued that it is not true, which would mean 0% of people will be deterred by the matter.)
I responded by stating I will not buy the new phone due to this. (Meaning not 0% of people.)
The argument should have been over. 🤔
 
I understand why this is a slippery slope but I don’t like the idea of child predators breathing a sigh of relief.

No child predators are uploading their illegal content to iCloud, let’s be serious. This feature is just a huge misstep and slippery slope to privacy for everyone ELSE. They were never going to catch child predators with this feature…
 
Apple can’t do that, because if it were found out, Apple could get into serious legal trouble with consumers.

Just waiting on Tim to cancel this CSAM scanning, waiting for him to read all these comments. People are against this feature. That’s the bottom line. 🥺

Let’s get this CSAM scanning shut down.

Protect your FREEDOM, PRIVACY & RIGHTS.

or should I say…

Protect your WIFE, KIDS & FAMILY.


Commissar Cook saying to himself: "Time for my nappy".
 
The intentions behind CSAM photo scanning are good, and not being a sexual predator myself, I support the concept. However, from my interactions with a wide range of society, and having case-managed a lot of people over the years, I believe the worst predators are highly intelligent people who are capable of avoiding being caught. I can foresee that governments will apply and/or regulate this technology (if they aren't already doing so covertly) for the 'protection of children' and whatever else they want to monitor about their adult citizens. This level of technology did not exist when I was growing up. I was never interfered with as a child, and a major factor is that loving parents and grandparents protected me and subtly diverted me away from such dangers.
 