The whole point of the system is to keep the photo off the cloud: to stop it from being uploaded if it matches. If a number of photos match, Apple will conduct a human review. Remember, the system has detected and matched the photo on device, and because it is designed to stop the photo from being uploaded to iCloud, it will not upload it; the human review would then happen on device.

Here is where you go wrong.

The system will not stop a photo from being uploaded even if it is matched. If matched photos were held back, it would be easy for Apple to determine you had CSAM images before the threshold was ever reached.

All photos will be uploaded to iCloud Photo Library if you have it turned on.
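Roughly, the flow Apple describes in its technical summary looks like the sketch below. Everything here is an illustration only: the names are made up, and a plain cryptographic hash stands in for NeuralHash and the blinded database. In the real design, private set intersection and threshold secret sharing mean neither the device nor the server learns about individual matches.

```python
import hashlib

def perceptual_hash(photo_bytes: bytes) -> str:
    # Stand-in only: a cryptographic hash is NOT a perceptual hash like NeuralHash.
    return hashlib.sha256(photo_bytes).hexdigest()

# Hypothetical stand-in for the known-image hash database (blinded on a real device).
KNOWN_HASH_DB = {perceptual_hash(b"example-known-image")}

def make_safety_voucher(photo_bytes: bytes) -> dict:
    # In the real design the match result is encrypted inside the voucher,
    # so neither the device nor the server can read it on its own.
    return {"matched": perceptual_hash(photo_bytes) in KNOWN_HASH_DB}

def upload_to_icloud(photo_bytes: bytes) -> dict:
    voucher = make_safety_voucher(photo_bytes)
    # The key point from the post above: the photo is uploaded either way.
    # Only the voucher carries match information, and Apple can decrypt
    # vouchers only after roughly 30 of them accumulate for one account.
    return {"photo": photo_bytes, "voucher": voucher}

print(upload_to_icloud(b"example-known-image")["voucher"])  # {'matched': True}
print(upload_to_icloud(b"holiday-photo")["voucher"])        # {'matched': False}
```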
 
What concerns me is the high number, since that could indicate their system has a high false-positive rate if it needs to be set at 30.
Perhaps they know all they have to do is make a “good faith” effort to keep their immunity intact under Sec 230 of the CDA? They don't need a single positive to trigger a review; they can set the bar pretty high and still keep their liability shield.
 
So what I'm hearing is that once I have 30 pictures of my baby, Apple will start looking at them taking baths? No thanks.
You didn’t read how the system works, did you?

The only way that could happen is if your child’s picture is in the exploited children database of photos.
 
What concerns me is the high number, since that could indicate their system has a high false-positive rate if it needs to be set at 30.

That's the very reason they introduced a threshold feature. Setting it this high is how they can achieve the "1 in a trillion accounts per year" number.

Another reason is that in the US it's not illegal to have 1 or 2 such images, so it has to be set to at least 3.
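As a back-of-the-envelope illustration (the per-image false-match rate and library size below are made up, not Apple's published figures), the account-level false-positive probability collapses as the threshold rises:

```python
from math import lgamma, log, exp

def log_binom_pmf(n: int, k: int, p: float) -> float:
    # Log of the binomial probability mass function, computed in log space
    # so huge binomial coefficients don't overflow a float.
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1 - p))

def prob_at_least(n: int, p: float, k: int) -> float:
    # P(at least k false matches) among n independent photos.
    return sum(exp(log_binom_pmf(n, i, p)) for i in range(k, n + 1))

n, p = 20_000, 1e-6              # illustrative library size and per-image false-match rate
print(prob_at_least(n, p, 1))    # ~0.02: a single false match is not that rare
print(prob_at_least(n, p, 30))   # effectively zero: hence a high threshold
```

Whether the real per-image rate justifies a threshold of 30 is exactly the open question, but the arithmetic shows why a higher bar buys a much stronger account-level guarantee.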
 
To make it easy, just remember that on-device intelligence is ALREADY scanning your photos. How do you think object detection works??? It's the same concept, but for illegal content.
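For a concrete sense of what that on-device "scanning" amounts to, here is a minimal sketch using an off-the-shelf torchvision ImageNet classifier. This is not Apple's Photos pipeline or the CSAM matcher, and photo.jpg is a placeholder path, but the principle of a neural net labeling pictures locally is the same.

```python
import torch
from torchvision import models
from PIL import Image

# Off-the-shelf ImageNet classifier; Photos uses its own on-device models,
# but the principle (a neural net labels your pictures locally) is the same.
weights = models.ResNet18_Weights.DEFAULT
model = models.resnet18(weights=weights).eval()
preprocess = weights.transforms()

img = Image.open("photo.jpg").convert("RGB")  # placeholder path
with torch.no_grad():
    logits = model(preprocess(img).unsqueeze(0))

label = weights.meta["categories"][logits.argmax().item()]
print(f"This photo looks like: {label}")
```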
 
Slippery slope arguments are logical fallacies. You can deal with these issues separately. Apple should do what it can to stop perverts from hurting kids.
Cool - so let Apple come to dinner at YOUR house and poke around in your drawers all they want. I won’t let them do that in my house. Even Apple previously refused to build a back door into their system because of what might happen in the future. So, I’m glad you feel confident in your logical fallacies argument.

Again, I'm all for protecting the kids. But shouldn't that be the responsibility of our government and law enforcement, and NOT of a group of people building phones and computers?
 
How is law enforcement supposed to find out about illegal content if not through notification from service providers?
 
I think you lack a basic understanding of technology.

There will be a scanning system on your device.
At any moment that scanning system can be used to scan.... for something else.
It's simply a matter of changing a few lines of code.

That's the problem.

That's also true of the code in the Photos app, which is extremely good at finding pictures that fit a certain category.

I have about 30,000 photos. I search for "sushi" and it finds the one picture of sushi I have.

Think about extending the categories Photos is good at to "images with scenes from protests" using machine learning.
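To make that concrete, here is a sketch using the open-source sentence-transformers CLIP wrapper (the filenames and queries below are placeholders, and this is not the model Photos actually ships): switching from finding sushi to finding protest scenes is literally just a different query string.

```python
from PIL import Image
from sentence_transformers import SentenceTransformer, util

# Open-source CLIP model: images and text are embedded into the same vector
# space locally, so a text query can rank the photos on the device.
model = SentenceTransformer("clip-ViT-B-32")

photo_paths = ["IMG_0001.jpg", "IMG_0002.jpg"]  # placeholder filenames
image_embs = model.encode([Image.open(p) for p in photo_paths])

for query in ["a plate of sushi", "a crowd at a street protest"]:
    scores = util.cos_sim(model.encode(query), image_embs)[0]
    best = int(scores.argmax())
    print(f"{query!r} best matches {photo_paths[best]} (score {scores[best].item():.2f})")
```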
 
How does this change the fact at all that there’s now essentially a new backdoor to be abused that’s installed in iOS 15?

Stop defending and get rid of this BS, Apple.

Stupid accusation. Before you install the software update with the "expanded scope", you let the researchers check whether it really did expand the scope to be "essentially a new backdoor".
 
Just so all are clear..

Apple are risking their entire privacy narrative, and the potential downstream implications of building in tools to scan user content on their own devices, to catch:

Pedophiles who use Apple products, store CSAM in their camera roll, also use iCloud to collect known CSAM, have more than 30 photographs of known child pornography, and who are going to read all of this news and still carry on as usual.

Incredible...
 
Just so we are clear… Apple, by law, has to make a good faith effort to keep child pornography off their platform to keep their immunity under Sec 230 of the CDA.
 
So countless of examples of exploited mass surveillance systems are no rational basis?

Oh, please. This isn't a "mass surveillance system" - which implies Apple wants to know about everything that's on our phone. All that's happening is that illegal images are being flagged and Apple is only notified if a good number of those are uploaded to THEIR servers.

Only considering things for which there is concrete evidence is precisely being naive. This isn't about proving anything, it's about ethics.

Yeah, how "naïve" of me to require evidence to support claims of wrongdoing. I guess we should start hanging people before they have a trial too. I find it hilarious that you're trying to claim the moral high ground with this.

There's obviously no reasoning with you. So I'm not going to go back and forth with you on this further, as all we'll be doing is repeating ourselves. You've already made your mind up and closed it. My mind is open, in that if I see actual evidence of wrongdoing, I will acknowledge it. Until then, I presume Apple is innocent.

👋
 
To be clear, every phone manufacturer does this. Some have even more invasive techniques for filtering illegal content.
 
From Reddit:

[image: snitchOS.jpg]
 
It’s the local scanning that bothers me. I see it as a weapon of mass surveillance. Even mighty Microsoft had to let China examine their Windows source code.

I fear the day when that local scanner looks beyond its scope and I can never trust Apple to do the right thing with this specific technology.
Local scanning has been happening for quite a long time. For example, in People & Places, AI is used to detect faces and to group faces of the same person. However, it is true that this time is a bit different and Apple is crossing a line. Until now, according to what Apple claimed, they didn't touch the results of that local 'scanning'. This time, Apple is essentially accessing the 'scanning result'.

Technically, the 'scanning' uses a deep-learning algorithm to extract features from each photo. The features can include every object (like faces) in a photo. I think that to be classified as THE photo, Apple requires several objects to be present at the same time in a photo. The current algorithm should be able to achieve a very low false-positive rate (easily less than 1%). Still, it is crossing the line if Apple accesses the classification result.
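As a toy illustration of that last point, the sketch below uses random stand-in feature vectors and made-up thresholds (nothing here is Apple's actual NeuralHash or classifier) to show the general shape: per-image similarity decisions first, then an account-level count before any human review.

```python
import numpy as np

rng = np.random.default_rng(0)
library = rng.normal(size=(1_000, 128))  # stand-in features for a user's 1,000 photos
targets = rng.normal(size=(10, 128))     # stand-in features for known images

def normalize(x: np.ndarray) -> np.ndarray:
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

# Cosine similarity between every library photo and every target image.
sims = normalize(library) @ normalize(targets).T  # shape (1000, 10)

SIM_THRESHOLD = 0.95     # per-image decision threshold (illustrative)
ACCOUNT_THRESHOLD = 30   # the account-level threshold discussed in the article

matches = int((sims.max(axis=1) > SIM_THRESHOLD).sum())
print(f"matched photos: {matches}, human review: {matches >= ACCOUNT_THRESHOLD}")
```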
 