let’s make this clear: the iPhone creates the hash. The matching process happens in the cloud.
Not quite. The matching is done on the phone to generate the positive vouchers when there's a match. The hashes, database, whatever you want to call it, to be matched against are built into iOS 15. What happens when you are connected to iCloud is that if there are enough positive vouchers to reach the threshold, they get decrypted, and iCloud puts a flag on the account to be manually reviewed (along with the matched images, decrypted for inspection). It's all in Apple's own documentation. The first scan is done on the phone.
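Very loosely, the flow above can be sketched like this (a toy illustration only, not Apple's implementation: the string hashes, the SafetyVoucher type and the plain count check are my own stand-ins, and the real system uses NeuralHash plus threshold secret sharing so the server can't decrypt anything below the threshold):

```swift
import Foundation

// Toy sketch of the described flow; not Apple's code.
struct SafetyVoucher {
    let payload: Data   // in the real system, only decryptable once the threshold is met
}

// On-device: compare each photo's hash against the database shipped with the OS
// and emit a voucher for every positive match.
func makeVouchers(photoHashes: [String], knownHashes: Set<String>) -> [SafetyVoucher] {
    photoHashes
        .filter { knownHashes.contains($0) }
        .map { SafetyVoucher(payload: Data($0.utf8)) }
}

// Server-side (iCloud): only when enough positive vouchers accumulate is the
// account flagged and the matched material opened for manual review.
func accountFlaggedForReview(vouchers: [SafetyVoucher], threshold: Int = 30) -> Bool {
    vouchers.count >= threshold
}
```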
 
this analogy is completely incorrect. What’s actually happening is that before you leave your house, you create an inventory of all the items you’re gonna bring across state lines and deliver that list to the FBI agent once you get to the border. That way, the agent does not need to go through your actual items.

let’s make this clear: the iPhone creates the hash. The matching process happens in the cloud.
FOR NOW.
 
You claimed scanning was ok. I was clearly saying that it's only ok if I agree and the scan is 100% for my benefit only. Your benefit is the money I paid.

That's fine, but Apple disagrees.

Apple implicitly says: if you want to use our property, you have to agree to a scan of all photos coming to our property, independent of whether you believe it benefits you or not.
 
The problem is that Apple can easily turn it on in the background without telling you. And they can make subtle changes to it to copy every file.

Or Apple could combine the detection algorithms in Photos together with iCloud backup.

There are so many more effective ways Apple could do this and no one is worried about those.
And they can be forced not to tell you.
 
The problem is that Apple can easily turn it on in the background without telling you. And they can make subtle changes to it to copy every file.

Or Apple could combine the detection algorithms in Photos together with iCloud backup.

There are so many more effective ways Apple could do this and no one is worried about those.
Yeah, no one worries about this, just like lots of other people.
At this point, I would not be surprised if Apple announced a "Kids Safety Camera" feature that blurs certain objects (body parts, bloody scenes, etc.) when a kid is trying to use their iPhone to shoot something.
Now that I am thinking about it, the iPod touch is probably more privacy-focused than the iPhone, because the A10 might not have the hardware to do efficient on-device scanning.
 
That's fine, but Apple disagrees.

Apple implicitly says: if you want to use our property, you have to agree to a scan of all photos coming to our property, independent of whether you believe it benefits you or not.
In this case, "property" means every single iPhone, iPad, iPod Touch, Mac, HomePod, Apple TV etc being sold by Apple nowadays. :(
 
It shouldn't surprise us if some people here in the forum are on Apple's payroll.

It's normal that a company like Apple monitors forums and pays people to post.

It's not normal at all. It is an untrue allegation without any evidence other than your own wild fantasy.

Samsung on the other hand...
 
This is insane lol. You might be brainwashed by Apple, but supporting this is a whole new level...

The Messages pic just blew my mind; it's like when you write a comment on YouTube and it gets deleted because you triggered some of the blacklisted words. Lol, imagine that happening on your own phone, while talking to a friend you know personally. Absurd.
 
In this case, "property" means every single iPhone, iPad, iPod Touch, Mac, HomePod, Apple TV etc being sold by Apple nowadays. :(

No, it means iCloud Photo Library.

Also, Apple owns all of the software you use on your Apple devices. You just own the physical device and your personal data.
 
Can I please clarify this, as I've been listening to this topic all week on podcasts.

There are 3 stages to this, and the one part of this which seems to be considered worse is this one......

If a child (when I say child, I mean a child as classified by current laws in whatever country you care to pick) decides, for whatever reason, to take an image of some body parts to send to a close friend, then in the future their iPhone will scan and detect the private photo they took, and if this scanning thinks it has detected something "naughty", the image will be sent to a human at Apple to confirm whether it is indeed a photo that is actually "naughty".
And then action(s) can/may be taken?

That's correct isn't it?

As far as I understood it from Craig’s WSJ interview, this is an opt-in child-protection iMessage feature that will alert the respective parents.
 
Interesting, given the country that seems so in tune with Apple, and especially the story run in the New York Times:

Censorship, Surveillance and Profits: A Hard Bargain for Apple in China

: https://www.independent.co.uk/news/...obile-phone-face-scans-data-cpc-a9228371.html

Apple suggests they have not been leaned on and would not pass data on to a government, and I want to believe it's true, as the Apple I know would never do that. But given some of the comments in the New York Times article, others might suggest they've already done that, and Apple needs to step away from this latest foolhardy destruction of their own brand, where privacy has been such an important pillar, one they have played well... up until this latest situation.

Apple gives law enforcement agencies in the US the entirety of your iCloud if served with a warrant or similar.
I assume they do the same thing in China.

I have no problem with such a policy.
 
Can I please clarify this, as I've been listening to this topic all week on podcasts.

There are 3 stages to this, and the one part of this which seems to be considered worse is this one......

If a child (when I say child, I mean a child as classified by current laws in whatever country you care to pick) decides, for whatever reason, to take an image of some body parts to send to a close friend, then in the future their iPhone will scan and detect the private photo they took, and if this scanning thinks it has detected something "naughty", the image will be sent to a human at Apple to confirm whether it is indeed a photo that is actually "naughty".
And then action(s) can/may be taken?

That's correct isn't it?
No. There are two separate "features."

The first one, the one you are describing, is just local self-"censorship" for child (under-13) accounts. It's opt-in, so you can just ignore it altogether if you want to. Most kids will lie about their birthdate on online accounts anyway, so they can get full features. It uses ML to "guess" at some photos, blurs them out, and puts up a warning if it thinks something is offensive. That's about it. And the important part: it's OPT-IN. That's the key. You, as a parent, will make the conscious decision whether to enable this or not.
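As a rough sketch of what that opt-in gating amounts to (my own toy illustration, not Apple's code: looksSensitive stands in for whatever on-device ML Apple actually uses, and the setting and function names are assumptions):

```swift
import Foundation

// Toy sketch of the opt-in Messages safety gating described above; not Apple's code.
struct ChildAccountSettings {
    var communicationSafetyEnabled: Bool   // chosen by the parent, off by default
}

// Decides whether an incoming image should be blurred and a warning shown.
func screenIncomingImage(_ image: Data,
                         settings: ChildAccountSettings,
                         looksSensitive: (Data) -> Bool) -> (blur: Bool, warn: Bool) {
    // If the parent never opted in, nothing changes: the image is shown as-is.
    guard settings.communicationSafetyEnabled else { return (blur: false, warn: false) }
    // If enabled and the on-device model flags the image, blur it and warn;
    // the decision is made locally and nothing is sent anywhere.
    let flagged = looksSensitive(image)
    return (blur: flagged, warn: flagged)
}
```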

The second feature, the CSAM part, is forced on you with iOS 15, at least the on-device scanning part. There are two phases to the process. First, your device running iOS 15 will scan and see if hashes of your photos match its database built into iOS 15. This is done whether you want it or not. There's no way to opt out other than to stop using iPhones (and later, Macs). If there are positive matches, the iPhone will generate vouchers. If you don't use iCloud Photos, the process stops here. If you use iCloud Photos, after a certain threshold (Craig said 30), iCloud will flag the account, and those positive vouchers and the photos will get decrypted to be reviewed and reported to the authorities.

The first feature I have no issues with. It's opt-in, meaning we, the users, have control over whether we want to use the feature or not. The second feature I have issues with, since the on-device scanning part is compulsory when you upgrade to iOS 15. The database is coded into iOS 15, and it's opaque. There's no way for you to know what's being scanned for (Apple doesn't know either; they rely on the hashes from other parties). It's like being searched for "illegal" stuff without knowing what's illegal. Then on the iCloud part, Apple doesn't indicate any appeal process. You're basically trusting Apple to be judge and jury based on something you cannot even check for yourself. This is the chilling part, especially once we consider what can be deemed illegal in some countries.
 
It is constantly looking at stuff on a device I fully paid for and I own. Not Apple. My employer owns any work-provided computing system I use to perform my job. They are free to audit it as they please. Apple forfeited all ownership of my personal iPhone when I paid them over 1K for it; they have no business looking into it. See the difference?

What's next? Car manufacturers putting always-on cameras and microphones in my car, constantly watching, so that they can determine if I ever kidnap anyone?

Haven't you read about Tesla's cabin cameras? They constantly watch you while driving and analyse what you do.
If you turn on data sharing it can be shared with Tesla.

Sounds a lot like how Apple's iCloud Photo Library operates.
 
What I don't understand is why Apple chose to make this feature in the first place. They have always been about privacy, and even took measures to make it easier for them to fight government requests for information. These features don't benefit the end-users (although I suppose one can make a case about the sexting detection for parents) and it completely opens them up to government requests. Where did this come from?
I believe there might be more to this if we knew what Tim Cook got out of his summer trip to the Sun Valley conference. The timing is too sudden, and Apple didn't even touch on this during WWDC. It might have been a sudden decision.
 
Haven't you read about Tesla's cabin cameras? They constantly watch you while driving and analyse what you do.
If you turn on data sharing it can be shared with Tesla.

Sounds a lot like how Apple's iCloud Photo Library operates.
The Tesla cabin cameras are physically restricted to only look at what's in the car. They don't go with you to also record your activities inside your home. A phone is a more personal device, as its content can reveal a lot more than what you do in a car.

You are being watched when you go into a theater to watch a movie (to prevent piracy). People have no problem with this, as the activities inside a movie theater are limited. But having a camera watching your activities inside your bedroom is a different story. Same basic idea, different concept and context.
 
There is probably pressure from several entities for Apple to start actively scanning iCloud especially for this kind of material.

This increases the chances that they won't need to do that in the future.
They're scanning your phone though. Not iCloud.
 
The Tesla cabin cameras are physically restricted to only look at what's in the car. They don't go with you to also record your activities inside your home. A phone is a more personal device, as its content can reveal a lot more than what you do in a car.

You are being watched when you go into a theater to watch a movie (to prevent piracy). People have no problem with this, as the activities inside a movie theater are limited. But having a camera watching your activities inside your bedroom is a different story. Same basic idea, different concept and context.

There is no camera watching your bedroom, but Apple could easily record such things without your knowledge.
You know what pictures you take. If no pictures are taken in the bedroom, it's impossible for this feature to do anything. And you will have to agree to use iCloud Photo Library to be impacted directly.
 
If it happens in the cloud, you have no control and can't even indirectly see what's happening.
I don’t see that the system will give you any control or show you in any way what is happening on your device.

Apart from that, I wish you were right. And there would have been much less backlash if they had bundled this new functionality with full end-to-end iCloud encryption.
 
I took photos of my 3-month-old daughter taking a bath for her first-year album. Does that mean this algorithm considers that abuse/a crime/I don't know what, and removes those photos from my iCloud and iPhone? What have we done with this world?

No, you are assuming this algorithm must try to identify naked skin, and that if you have an image with naked skin in it, the system will easily be fooled.

No, the algorithm doesn't work like that.

No one has presented any evidence that other naked pictures (or pictures with a large amount of skin) produce false matches to any significant degree.
 
And HOW do they generate the hash? By reading the file.

I am not ok with Apple or any other entity reading the content on my device.

Yes, but they might not even read most of the file. What if they only read 25% of it, would that make you feel better?
 
No, it means iCloud Photo Library.

Also, Apple owns all of the software you use on your Apple devices. You just own the physical device and your personal data.
Obviously, in Apple's view, I do not even own my personal data.
 
No, you are assuming this algorithm must try to identify naked skin, and that if you have an image with naked skin in it, the system will easily be fooled.

No, the algorithm doesn't work like that.

No one has presented any evidence that other naked pictures (or pictures with a large amount of skin) produce false matches to any significant degree.
It might lead to a false match though. And if you sent these photos to someone (for instance via email), they might well end up in the CSAM database.
 
It scans the image to generate a hash; if that hash is close to a kid porn hash, you get flagged.

Just because it doesn't compare images like a human would doesn't mean it's not looking for naked people.

The algorithms are not looking for naked people at all. The algorithms in the Photos app would be much better for that.

NeuralHash has two design goals:

1. Find images which are copies (or derivatives) of images in the NCMEC database
2. Be extremely good at not flagging images that are not in this database

It's #2 which makes this system so inherently bad at finding "people who protest", "gay people", "people with guns", "innocent naked pictures of my children", etc.

If you, in the general sense, create your own child pornography, NeuralHash shouldn't find it. I also believe that even if you use the same children but create new imagery, NeuralHash will not catch it.
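A crude way to see why (a generic perceptual-hash toy of my own, not NeuralHash itself; the 64-bit hashes and the distance cutoff are assumptions): a match means being within a few bits of one specific known hash, not belonging to a category of imagery.

```swift
// Generic perceptual-hash toy, not NeuralHash: matching is closeness to a
// specific known hash, not recognition of what the image depicts.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

func matchesDatabase(_ imageHash: UInt64, database: [UInt64], maxDistance: Int = 4) -> Bool {
    // A re-compressed or lightly cropped copy of a database image stays within a
    // few bits of the original's hash; a brand-new photo, whatever its subject,
    // will almost never land that close to any entry.
    database.contains { hammingDistance(imageHash, $0) <= maxDistance }
}
```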
 
After ploughing through a thread full of gaslighting and horror, I am going to follow the people who not only understand this tech but are horrified by it and its potential: the EFF; Edward Snowden, who has first-hand knowledge of what was happening years back at the NSA; and the many organisations that exist to stop privacy breaches and create tools to help us, which are all saying this is bad news now and even more so further down the line. Governments are really happy though; that tells me all I need to know. The EFF article, the professor from Johns Hopkins University in the US: they know more than a bunch of Apple apologists on MacRumors, I would imagine, so if they are worried, then I am.

Apple say they won't bend to governments trying to use this system for no good. I really can't see, with pressure applied in the right financial areas, that Apple will have a choice; they are just a big company that makes lots of money, with an over-inflated view of its product line these days, which I will now back away from, and with sadness too. A Pixel 6a with GrapheneOS sounds great, as do some flavours of Linux; tbh, anything but blindly following Apple because Apple tech is supposedly better than anyone else's (which is highly subjective) and it's Tim Cook's Apple so it must be safe. Talk about the blind leading the blind; I think that part is almost more scary sometimes than this whole debacle.
 