I'm saying I want the same level of privacy for a photo I took of my TV's HDMI ports as I do for surgery, laboratory, and romantic events. The content of the image in no way diminishes my right to privacy.

This in no way protects children, and as far as I can tell it drives abuse up, not down. No one wins by scanning photos.

Why not hide the images we are concerned about?

I think a half-intelligent pedo would hide their photos. But I'm not one of them, so I can't really say if this is a common behavior.

Hiding illegal stuff is quite normal.

Not hiding nudes or other stuff that is legal is up to you. I don't categorize this as illegal material, so you don't need to be on your guard as much.
 
Those are 1:1 synced iCloud images. How is this a back door, but if only the iCloud servers were scanned for the very same images, that wouldn't be a back door? A back door is when a government has the key to the iPhone's Secure Enclave, so that even if the user has iCloud off, governments can get into the iPhone easily. THAT'S a back door. Genius.
Again, why put this on device if iCloud is already doing this? And this ONLY applies to iCloud photos. It makes no sense to suddenly put it on device.
 
I find this situation akin to my wallet. I keep my credit cards in Apple Pay because Apple said that all of the cards stay only on the phone. I use Apple Cash to pay my friends for my share of the dinner if we go out. I have the Tesla app on my phone so it acts as a key to my car. (I also have the Tesla card in my wallet.) My iPhone will probably be able to store my driver's license in the future. Apple has already implemented school IDs in the Apple Wallet.

Whenever I add something to my physical wallet, a blind person, who's been trained to differentiate the many items that go into a physical wallet, will grab my wallet and the item from me and feel all over it. If he/she thinks that this item is suspicious, a person wearing red-and-blue 3D movie shades takes a look at the item. If he/she finds it suspicious, they'll make a note of it until I reach 30 suspicious items. Then the authorities are notified.

But guess what? I don't need to commit a crime for this to happen. I'm already a suspect because I have a physical wallet. My wallet will go through needless wear and tear.

I don't think anyone will dispute that the Neural Engine actively consumes a portion of the battery. The scanning will occur whether or not iCloud Photos is turned on in iOS 15. Apple will hash your photos in iOS 15; if iCloud Photos is turned off, those hashes will not leave your phone, as of this writing.

This is how asinine and intrusive this process is. I don't know how to further dumb it down for the Apple defenders. And I didn't even mention how governments can force Apple to add their own hashes.

This is coming from an Apple user who agrees that iCloud should be scanned for CP. No servers should be hosting CP. Apple's duty is to make sure no CP is on their servers, not to serve as a government's lapdog.
 
I think a half-intelligent pedo would hide their photos. But I'm not one of them, so I can't really say if this is a common behavior.

Hiding illegal stuff is quite normal.

Not hiding nudes or other stuff that is legal is up to you. I don't categorize this as illegal material, so you don't need to be on your guard as much.
You missed the point entirely. I want all my pictures private by default, and I shouldn’t have to hide them.

This change means I have to go out of my way to hide everything.
 
For someone who says everyone else doesn't understand how this works, you really don't.

They generate hashes for every image on your device. If you then enable iCloud Photos and upload photos that fail the hash check, it uploads not only the photo but additional data. Fail the hash check on somewhere around 30 photos and an Apple person looks at them. If they think it's kiddie porn, you go to jail.

This is different from having your photos scanned only in the cloud.
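Here's a minimal sketch in Python of that flow as described above. To be clear, the names, data shapes, and stand-in hash are mine for illustration, not Apple's: the real system uses a perceptual NeuralHash, blinded matching, and threshold secret sharing, none of which a few lines of Python reproduce.

```python
import hashlib

MATCH_THRESHOLD = 30  # Apple has said the review threshold is "around 30"

def fingerprint(photo_bytes: bytes) -> str:
    # Stand-in for the on-device perceptual hash (NeuralHash). A real
    # perceptual hash tolerates resizing/recompression; this toy one does not.
    return hashlib.sha256(photo_bytes).hexdigest()

def make_voucher(photo_bytes: bytes, known_fingerprints: set) -> dict:
    # One "safety voucher" accompanies each photo uploaded to iCloud Photos;
    # in the real system its contents are encrypted, here it's just a flag.
    return {"matched": fingerprint(photo_bytes) in known_fingerprints}

def needs_human_review(vouchers: list) -> bool:
    # Nothing is surfaced to a reviewer until the number of matching
    # vouchers for the account crosses the threshold.
    return sum(v["matched"] for v in vouchers) >= MATCH_THRESHOLD
```

The key point the sketch captures: the vouchers are generated on the device, but only for photos headed to iCloud Photos, and human review only kicks in past the threshold.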
That is the other thing. What about this scenario?

Newly married couple in their early 20s. They both look young, maybe 16 or so. They spend most of their time long distance and like to share adult images with each other via iMessage. How will the person at Apple determine that the wife/husband is of legal age? Do they have access to everyone's driver's license to make that determination? Likewise, how will they protect 16-year-olds who look like they are 25?
 
There's no way to opt out other than to stop using iPhones (and later, Macs). I have issues with the second feature, since the on-device scanning part is compulsory when you upgrade to iOS 15.

It's not. If you turn off iCloud Photo Library, or have never used it, the scanning and matching will not occur.

It's the main reason why many of us aren't worried. You can decide not to participate.
 
So… were y'all okay with Apple scanning your iCloud library before this announcement? They've been doing it for a while, since 2019 at least.

This is different. Why move it from iCloud to all of our devices if it is already doing it?
 
I just don't want Apple to be scanning iCloud, period. It's a way to look over and dig through our private data. What if information gets leaked to the government or to criminals? Who's held responsible for that?

Find an alternative way to catch criminals. And why is Apple even getting involved?
Agreed. Apple needs to stay in their lane. Apple is putting themselves in the law enforcement lane.
 
Something occurred to me. This is going to be implemented only on iPhones, meaning that someone can upload any number of illicit images to iCloud via a Mac and they won't know.

This is not about safeguarding their servers or catching pedophiles; otherwise they would include macOS and iPadOS. This is about something else. It must be.

Little Snitch or upcoming programs would be able to disable this feature on a Mac, and if third-party stores and side-loading come to iPhones, the same will happen there, making this invasion of privacy even more ineffective.

Why is there such a strong push to shape how we think about digital privacy? Why shouldn’t privacy be expected on our digital devices as it is in our homes?
Nope, this is also coming to Monterey.
 
I don't see that the system will give you any control or show you in any way what is happening on your device.

Apart from that, I wish you were right. And there would have been much less backlash if they had bundled this new functionality with full on-device, end-to-end iCloud encryption.

It will give you control since you can turn it off by turning off iCloud Photo Library.

It's not shown directly, but indirectly. If it happens on your phone, you can look at the traffic going from the phone to the Internet. You can look at the processes running on the phone using Xcode. You can see whether the network traffic is different if you have zero photos or one photo in your library.

Experts would be in a much better position to look at what's happening locally. It's just difficult.

On Apple's servers it would be impossible.
 
This CSAM scan can't rely on pixel-perfect analysis, otherwise it's going to be weak. As for "categories" of pictures, with enough slices it is still possible to recognise object categories in the said photo. It does require more local storage to store category learning data, but that's mostly it, IMO.

CSAM detection doesn't recognize objects or put photos into categories. That's not how the technology works.

If someone took an illegal picture of a naked child and then raised their arms to take another picture at a slightly different angle, and the former picture made it into the CSAM database but the higher-angle shot didn't, the system would only be able to match against the first image.

If you cropped the first image, rotated it, etc., the system would be able to detect that too, but it cannot detect the second image. That's what fingerprinting does.
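A toy illustration of the difference, using a simple difference hash in Python. This is emphatically not NeuralHash, which is far more robust to edits, but the principle is the same: the fingerprint survives re-saves and rescales of the same picture, yet says nothing about what kind of object the picture contains.

```python
from PIL import Image  # pip install Pillow

def dhash(img: Image.Image, size: int = 8) -> int:
    # Shrink to a tiny grayscale grid and record, for each cell, whether it is
    # brighter than its right-hand neighbour. Rescaling the same photo barely
    # changes this bit pattern; a different photo produces a different one.
    small = img.convert("L").resize((size + 1, size))
    px = list(small.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            i = row * (size + 1) + col
            bits = (bits << 1) | (px[i] > px[i + 1])
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# Two synthetic stand-in "photos": a left-to-right gradient and its mirror.
grad = Image.new("L", (200, 200))
grad.putdata([x for _ in range(200) for x in range(200)])
mirror = Image.new("L", (200, 200))
mirror.putdata([199 - x for _ in range(200) for x in range(200)])

resized_copy = grad.resize((120, 160))  # same picture, rescaled

print(hamming(dhash(grad), dhash(resized_copy)))  # 0: recognised as the same image
print(hamming(dhash(grad), dhash(mirror)))        # 64: treated as a different image
```

Nothing in there knows what a body, a face, or skin looks like; it only answers "is this essentially the same picture as one I've seen before?"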
 
You missed the point entirely. I want all my pictures private by default, and I shouldn’t have to hide them.

This change means I have to go out of my way to hide everything.

I agree with you. I’m against this implementation and pro-privacy.
I'm just saying it's a bad implementation that is easy for CSAM criminals to circumvent. (Their default behavior probably already avoids this scan.)

For us normal users, it will be an extra step. This sucks. I've disabled iCloud Photos and use PhotoSync to back up to my Linux server.

What I'm saying is that this will suck for the average user (me and you), while criminals are probably not saving CSAM pictures in their camera roll (or can adapt easily), so for them this is nothing.

This sucks for privacy and can be abused. And it will not save any children.
Lose-lose.
 
Apple should not be incorporating such surveillance on customers' hardware. There is no excuse for it, no reason for it. If they want to make it part of iCloud, then that's their choice, and it's our choice whether to use iCloud.

Of course there's a reason. By doing it on-device for images destined for iCloud Photo Library, they don't have to do it within iCloud. Another benefit is that the scanning happens out in the open.

A second benefit might be that this will allow Apple to provide better encryption for iCloud. We would then gain end-to-end encryption for all of iCloud except child pornography.

You still have the choice. Turn off iCloud Photo Library and the scanning and the matching won't occur.

Anti-virus is an example of surveillance software which has been placed on customers' hardware for a long time. Very few object, and you can usually turn it off.
 
Can I please clarify this? I've been listening to this topic all week on podcasts.

There are 3 stages to this, and the one part which seems to be considered the worst is this one:

If a child (when I say child, I mean a child as classified by the current laws in whatever country you care to pick) decides, for whatever reason, to take an image of some body parts to send to a close friend, then in the future their iPhone will scan and detect the private photo they took, and if this scanning thinks it's detected something "naughty", the image will be sent to a human at Apple to confirm whether it is indeed a photo that is actually "naughty".
And then actions can/may be taken?

That's correct, isn't it?
Yes, this is a bit confusing to me as well. Where I am, 17 is considered a child. How will this help protect those 17-year-olds who look like they are in their mid-20s? Will the manual review process just shrug it off, thinking they are 25? And, just the opposite, how will someone avoid being treated as a criminal when they are in their mid-20s but look like they are 16? Do they have access to driver's licenses?
 
Of course there's a reason. By doing it on-device for images destined for iCloud Photo Library, they don't have to do it within iCloud. Another benefit is that the scanning happens out in the open.
And our private devices will be open for inspection 24/7? No, thank you.
 
No, you are assuming this algorithm must try to identify naked skin, and that if you have an image with naked skin in it, the system will be easily fooled.

No, the algorithm doesn't work like that.

No one has presented any evidence that other naked pictures (or pictures with a large amount of skin) produce false matches to any significant degree.
Well, we can't test it out or have someone perform a security/third-party analysis, as it's apparently just a black box. Nor can we play around with it and determine what thresholds the neural match has.

I would love to get into the nitty-gritty details of the code and algorithm and play around with it just to see, as looking at code helps me understand things better than human words.
 
Honestly, the TWiT network of podcasts is made up of very knowledgeable people who are very Apple-focused, wish to fully understand and explain all aspects, and have no personal agendas whatsoever.

Honestly I'd very highly recommend this if you'd like to watch:

MacBreak Weekly


Enjoy :)
Yep, agreed! I have watched Leo Laporte since the old TechTV days.
 
I'm not a lawyer, an expert, or even an American, but even I know the 4th Amendment doesn't constrain the actions of private companies. I'm fairly sure that nothing else in the US Constitution does either.
Is there some big legal loophole going on here, then? The government can just get private companies to do what the 4th Amendment stops it from doing itself?
 
As much as we might not want Apple to do this via iCloud, that would not compromise individual systems, whereas the proposed method Apple will use DOES compromise systems. For example, many UK government agencies use Apple devices in the field, many local authorities use Apple equipment now, and there are even agencies involved in fighting child abuse where such pictures are held and passed on to fight the very thing Apple suggests it is doing this for; these agencies' jobs will become harder and harder. If Apple goes ahead with the proposed system on the billion or so customers' devices it has sold, many of these agencies could not use that equipment, as it would compromise the specifications required for such work. A similar situation would occur where businesses use Apple devices and companies may be dealing with sensitive work.

Are you saying public servants or people from non-profit organisations are going around with child pornography on their phones? And it is stored in the local photo library on the phone? What about iCloud Photo Library and iCloud Backup today? If they are allowed to turn those on, those images could end up on servers in foreign countries, and the risk of breaking the laws in those countries would be huge.

Storing large amounts of child pornography on a mobile device seems extremely dangerous. These images should be extremely restricted and not connected to the Internet.

Even in the US, law enforcement agencies don't have general access. Only a few select agents and people do. But in the UK they put this stuff on mobile devices. Pretty scary!

Maybe the UK should create a safe and secure "Android UK" version? Buying Android phones would certainly save British taxpayers some money.
 
So let's assume Apple and the rest of big tech manage to catch some of those with child porn. If it works, then why stop there? Why don't we scan for data relating to murders and other serious crimes? Scanning documents for financial crimes would most likely yield excellent results, so we should do that also.

How would the police get the photos the murderer took so they could create the hashes and send them to Apple?
How would you use this system if the murderer took no photos during the crime?
 
iPhones in countries outside the USA are updated at the iOS level by local providers; everyone knows that, since Apple highlights this update request. You will notice it when you associate your iPhone with a provider for the first time. This gives governments worldwide easier access to the iPhone than to Apple's cloud, which is probably too complicated and complex to analyze.

And for the US (you know).
What is your intention in sending me a link about an NSL? Are you trying to say that even if Apple were to receive an NSL, they could still publicly discuss what is in it?

What I described might not have been an NSL exactly, but some form of national security order might prevent Apple from disclosing their true intentions to the public IF it is a national security issue.
 
I'm saying I want the same level of privacy for a photo I took of my TV's HDMI ports as I do for surgery, laboratory, and romantic events. The content of the image in no way diminishes my right to privacy.

This in no way protects children, and as far as I can tell it drives abuse up, not down. No one wins by scanning photos.

Why not hide the images we are concerned about?
This I agree with. I want to see the creators of these sick images in jail or worse. Some sick individual who just comes across these images on a website or something and saves them (sick, I keep saying that) doesn't do anything about the actual abuse or the creation of the images. Joe Someone, who is just a sick individual who found something (or 30 somethings) on some random website and saved them, goes to jail, while Jim Someone, who actually abused the kid, is free to do it again with another kid.
 
Not hiding nudes or other stuff that is legal is up to you. I don't categorize this as illegal material, so you don't need to be on your guard as much.
That is the thing, though. How will Apple's manual review determine whether something is legal or not, other than the extreme "kid" variety? How will this protect 16-year-olds (illegal, and still a child where I live) when they look like they are 25? What will prevent 25-year-olds who look 16 from being falsely accused?
 