Just remember it is partial content scanning. What does that mean? It means, in this case, that the code runs an algorithm that looks for specific similarities in a subset of the total picture, matching it against a code or set of codes.
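
For the curious, here is roughly what that kind of matching looks like. This is a toy "average hash," not Apple's actual NeuralHash (which uses a neural network), but the shape of the idea is the same: reduce the picture to a small code, then compare codes by how many bits differ.

```swift
// Toy perceptual hash (aHash) sketch -- illustrative only, NOT Apple's
// NeuralHash. Assumes the image has already been downscaled to an
// 8x8 grid of grayscale values in 0...1.
func averageHash(_ grid: [[Double]]) -> UInt64 {
    let pixels = grid.flatMap { $0 }
    let mean = pixels.reduce(0, +) / Double(pixels.count)
    var hash: UInt64 = 0
    for (i, p) in pixels.enumerated() where p > mean {
        hash |= UInt64(1) << i   // one bit per cell: brighter than average?
    }
    return hash
}

// Two images "match" when their codes differ in only a few bits.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}
```

The point is that the code is derived from image content, so it survives resizing and recompression, which is exactly why it feels uncomfortably close to recognition tech to some people.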

Now why does that sound just like facial recognition? Think before you react. I did not say it WAS facial recognition, I said it was nearly the same algorithm. Now think about how well Apple's facial recognition does NOT work.

1) This is so near facial recognition that there is no way to stop it from being applied to that end without users ever being aware. In fact, I'll bet the government is forcing Apple to do this and Apple is trying to make a positive out of it before they get caught.

Next it will be, "we are only looking for terrorists." And there is the problem: a terrorist is anyone the government does not currently like. Why? Because the government has discovered (thanks to President Bush and the Patriot Act) that they don't have to follow any laws as long as the person in question is classified as a terrorist. A terrorist can now be held without due process and without human rights.

2) Remember how well Apple's facial recognition does NOT work. Sure, it's fine for family photos, where it doesn't matter if it's wrong.

But do you really want SWAT showing up at people's homes based on Apple's picture recognition AI? (And I know someone is supposed to review, so don't get me started on the problems with that.)

Come on, get real. Apple's spell checker is awful. Apple's Siri AI is awful. Apple is great at providing tools, but not at the actual end implementation.

To get that right, Apple will have to port over code from the CIA, FBI, NSA, etc., if they have not already.
What the hell are you talking about?
 
The software scans your photos (cars, hills, whatever), but doesn't call home to Apple.

Now, if the software notices a unique identifier (or enough unique identifiers) of known child porn, it sends a message to Apple "hey, better have a real person take a look at these."
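
A crude sketch of that "enough unique identifiers" logic, for anyone who wants it concrete. Everything here is illustrative: in the real design the matches are hidden inside encrypted safety vouchers that Apple can't open until the threshold is crossed, so no plain set like this ever exists in the clear.

```swift
// Hypothetical, simplified threshold check. The hash values below are
// made up; the threshold of 30 is the figure Apple has cited publicly.
let knownHashes: Set<UInt64> = [0x1234_5678, 0x9ABC_DEF0]  // stand-in blocklist
let matchThreshold = 30

func shouldRequestHumanReview(_ photoHashes: [UInt64]) -> Bool {
    let matches = photoHashes.filter { knownHashes.contains($0) }.count
    return matches >= matchThreshold   // "better have a real person take a look"
}
```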

What I still haven’t been able to find anywhere are the details of what the human review process entails. It’s illegal for anyone other than NCMEC to have those images. So how does Apple review them? If an adult nude image is falsely flagged, will Apple assume it’s a true positive? Keep in mind some 25-year-olds look 15, and some 15-year-olds look 25.
 
The shilling being done by some here is absolutely pathetic.

On device scanning is an invasion of privacy. Period.
Yup, and the majority of those doing so have an established history of railing against Samsung and Google for "invading their privacies," but magically when Apple does so it's apparently not just ok but actually a good thing.
"Hypocritical sycophant" is not a good look on most people.

More interesting is the way LGBTQ+ groups are directly targeting Tim with their statements regarding how the nude scanner in iMessages could put young members of that group at risk in their own homes. Apple has enjoyed a place of esteem among those communities, largely due to the efforts of Cook himself, and it would seem a tide shift is coming his way if he doesn't address this angle directly.
 
Thing is, as sick as CP is, it's just another thing that is illegal. There are lots of things in this world that are illegal. Animal abuse is illegal, dealing drugs is illegal, trespassing is illegal... Why aren't we scanning for that type of stuff too, if we're just looking at doing the morally right thing here? I can understand the suspicion.

Plus NCMEC is not new. Have things improved a lot? Do people think Apple is just some safe haven for these sick people?
 
I'm not convinced by your two arguments: you should be able to decide which applications download images directly to your photo roll, and you'll need to bring me evidence of how an iPhone could be more easily hacked than iCloud.

And your first point would be equally bad whether it's client or server side anyway.


My real issue with this outrage is: if client side scanning is not ok, server side scanning is not ok either (less evil is still evil). Both are fugly if they can't be peer reviewed in any way.

I understand people being skeptical, to say the least, about Apple's approach, but I have yet to find voices getting angry about the status quo, which is server side scanning.

You do understand that iCloud’s web interface can be hacked into and photos placed there which automatically propagate to all iCloud devices synced to it?
 
As a member of the LGBTQ community, I am dismayed by opponents of the parental controls in iMessage because of a presumed disparate impact on homosexual youth. The parental notification for potentially explicit imagery viewed on kids' devices applies only to kids under 13. Parents of all kids, including LGBT kids, deserve to protect these young kids from potentially being targeted by child predators.

Further, LGBT kids in repressive households or communities are MORE susceptible to grooming by child predators. We shouldn't limit parental controls because some parents have unhealthy parenting skills. Those parents could just take their kids' devices and check all the messages regardless.
As a former 13yo gay kid, I would be horrified that my parents would be notified of anything I wanted to research in regards to my developing sexuality in private, especially considering such a disclosure would have easily resulted in me being sent to a correctional camp.
 
It does seem hypocritical.

A web hosting company wants to be immune from what their customers post or host, and now Apple wants to 'police' their iCloud members' data. It does seem like an odd thing to want to do. So *should* hosting companies be held liable for their content? How can a hosting company 'police' all of their customers and eliminate 'objectionable content', and who decides what's objectionable? Seeing multiple angles on this doesn't help...

I think this is really a good point.

1) If Apple wants to start policing their devices, then I think Apple should be held liable and responsible when someone uses their devices to commit a crime.

2) If Apple does not police their devices, then Apple should not be held liable or responsible when someone uses their devices to commit a crime.

There is already law to support this, and it is why ISPs and telcos are not held liable for the content they carry. The discussion before Congress is about taking away that protection from Facebook and Twitter since they are censoring content.
 
As a former 13yo gay kid, I would be horrified that my parents would be notified of anything I wanted to research in regards to my developing sexuality in private, especially considering such a disclosure would have easily resulted in me being sent to a correctional camp.
Take that up with your non-understanding parents. Also, it'll only flag to your parents if you receive it in iMessage. Do your research in Safari and don't save the pictures…
 
At this point, someone at Apple must be regretting the decision to publish this feature, which understandably has created such a big wave of backlash.
Next time Apple does something like this, I wouldn't be surprised if they just keep their mouths shut and deny any allegations, like they always have with major hardware failures (antenna, battery, bend, etc.).

Cat's out of the bag now. I'm sure people are going to try to reverse engineer iOS (like what has already been done here) to see if anything else sneaks out.
 
As a former 13yo gay kid, I would be horrified that my parents would be notified of anything I wanted to research in regards to my developing sexuality in private, especially considering such a disclosure would have easily resulted in me being sent to a correctional camp.
That's something I never thought of, and it's a very good point against reporting texts to parents. A lot of parents would understand; a lot wouldn't.
 
It's on airport (their) property.


Ditto, though I would hope a beaten wife would be reported by anyone. It's not something you have to scan for!


Again, their property, their responsibility to follow law.


Their property, again.

My phone, *my* property.

Exactly! My phone, my property. Their property, their responsibility.
Which is why the scans only occur when uploading to iCloud Photos... to protect Apple's property.
 
As a member of the LGBTQ community, I am dismayed by opponents of the parental controls in iMessage because of a presumed disparate impact on homosexual youth. The parental notification for potentially explicit imagery viewed on kids' devices applies only to kids under 13. Parents of all kids, including LGBT kids, deserve to protect these young kids from potentially being targeted by child predators.

Further, LGBT kids in repressive households or communities are MORE susceptible to grooming by child predators. We shouldn't limit parental controls because some parents have unhealthy parenting skills. Those parents could just take their kids' devices and check all the messages regardless.

People keep bringing this up, so let me ask something: do you think being part of the LGBTQ community gives kids a free pass to exchange nude images?
 
You do understand that iCloud’s web interface can be hacked into and photos placed there which automatically propagate to all iCloud devices synced to it?
So you're saying that iCloud is easier to hack than an iPhone, the exact opposite of what the other poster was talking about. Quote them, not me.
 
Regardless of the outcome, I will never feel the same about security with Apple products as I have previously. They can do anything they want in the cloud, but not on my phone.

Imagine the whine-fest of epic proportions that would ensue if they did.
 
Which is why the scans only occur when uploading to iCloud Photos... to protect Apple's property.
For now... (until a govt asks them to change that.)

Part of looking out for future problems is to not get blindsided when it happens, and given this subject, it will happen.


They could protect their property on their property...
 
Apple has been willing to admit some mistakes. I hope they can admit this one. I get the good intention, but I don't understand how they couldn't foresee the negative consequences of something like this.
 
So do people think Apple has something to gain by implementing this, or do they think Apple is being forced to do this by some government agency? Because I'm genuinely puzzled as to why they would go to all this trouble if they didn't think it was the right thing to do.
It sounds like Apple has been planning to release this feature for a while now, since the CSAM code has been in their software since iOS 14.3. It sounds like Apple does not want to have to backtrack on this project.
 
For now... (until a govt asks them to change that.)

Part of looking out for future problems is to not get blindsided when it happens, and given this subject, it will happen.


They could protect their property on their property...
But the same can be said for server side scans; they could be amended to scan whatever they wish.
Either way you're still at their mercy. So why are we arguing?
At least with client side you can decide not to update your iOS, especially as Apple is providing the option to stay on iOS 14 if you desire.
 
It sounds like Apple has been planning to release this feature for a while now, since the CSAM code has been in their software since iOS 14.3. It sounds like Apple does not want to have to backtrack on this project.
They’ve confirmed it’s a different hashing mechanism to what’s in iOS 14 right now… not quite the same thing
 
They’ve confirmed it’s a different hashing mechanism to what’s in iOS 14 right now… not quite the same thing
The code will be modified in the latest version, but this code shows Apple has been planning to release this type of CSAM scanning software for a while.
 
As a former 13yo gay kid, I would be horrified that my parents would be notified of anything I wanted to research in regards to my developing sexuality in private, especially considering such a disclosure would have easily resulted in me being sent to a correctional camp.
This feature does not limit young people searching for information online. It is specifically targeted at images being exchanged in private communications. The issue here is the potential for child predators grooming children. Parents are notified only AFTER a child acknowledges the warning and still chooses to view the content.

Further, this feature allows parents to be less invasive with their children's communications, giving them an additional tool that alerts them to potentially dangerous (not just explicit) exchanges.
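
Roughly, the notification logic as Apple publicly described it works like this (the type and function names below are mine for illustration, not Apple's actual API):

```swift
// Sketch of the iMessage communication-safety flow as publicly described.
// Names are illustrative, not Apple's real API.
enum ChildChoice { case declinedToView, viewedAnyway }

func parentIsNotified(childAge: Int, choice: ChildChoice) -> Bool {
    // Every flagged image is blurred and the child is warned first.
    // Parents are notified only for children under 13, and only when
    // the child chooses to view (or send) the image despite the warning.
    return childAge < 13 && choice == .viewedAnyway
}
```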
 