Well, it wasn't a complete or cogent sentence in the context of what you replied to....so...
I was very clear. That other poster said it was mind-boggling to him/her why people agree with what Apple is doing. I replied that even though the majority of people posting here don't agree with what Apple is doing, the opinions of those who do agree should be respected. There's no right or wrong thinking; everyone is entitled to their opinion. Got it? If not, I can't explain it any better.
 
Those who are complaining obviously did not read how the technology works.

You have a higher chance of winning the lottery than Apple erroneously looking through your photos.

No, we read it and it's a real invasion of privacy.

Whether it's a machine or a person, they are still going through all our photos. Apple is doing both.
They are using a machine to go through everyone's photos, then getting an actual person to look at our photos when a match is flagged.

Using hashing is nothing special. If you are going to have a computer compare photos, that's the normal way it would be done in code. It doesn't improve privacy.
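(For context on the hashing point: the scheme at issue is a perceptual hash, which deliberately matches near-duplicate images rather than exact files, unlike an ordinary cryptographic hash. Here is a minimal sketch of one of the simplest perceptual hashes, an "average hash" over a toy 8-value image. Apple's NeuralHash is a far more elaborate, neural-network-based variant; nothing below is taken from Apple's documents.)

```python
# Toy "average hash": each bit records whether a pixel is brighter than
# the image's mean brightness. Similar images yield similar bit patterns.
def average_hash(pixels):
    """pixels: flat list of grayscale values (e.g. a downscaled thumbnail)."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits; a small distance means visually similar."""
    return bin(a ^ b).count("1")

img = [10, 200, 30, 220, 15, 210, 25, 230]       # toy "image"
tweaked = [12, 198, 33, 219, 14, 215, 22, 228]   # slightly edited copy
assert average_hash(img) == average_hash(tweaked)  # small edits survive
```

The point of matching on Hamming distance rather than exact equality is that small edits (recompression, brightness tweaks, minor crops) leave most bits unchanged, which is exactly what a plain file hash cannot do.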

The fact that an Apple employee can review our photos shows that they have a backdoor into the encryption, so their encryption is totally useless.

What if they review it and it turns out to be a naked picture of your daughter, your wife, or yourself? Or maybe it's a photo of a highly confidential business document. Totally unacceptable.

What if the government (e.g. the NSA or a politician) decides they want to have a look at your stuff or ban certain ideas? We already saw Facebook and Google censor what turned out to be the truth.
What if the government gives them the entire hash namespace and asks them to hand over all your photos?
The Taiwan flag, Winnie the Pooh memes, blackface, photos of police, porn, references to labs, gay people, bikinis: each of these is illegal in some country, and I can assure you those will be added.

What if a hacker, or a friend playing a prank, decides they want to fk u? I could get you arrested and **** your life simply by sending you a bunch of kiddy porn, or photos that will trigger matches, through WhatsApp or any other app that puts received files in your photo library. I could then hide my tracks by deleting the messages.

Whether or not other companies do it is totally irrelevant; it's unacceptable there too.
Apple is assuming you are guilty and taking it upon itself, without legal process, to search your photos every day in real time.

No, you can't just turn it off. Apple devices and App Store apps are so tightly integrated with iCloud these days that you can't do that without losing significant functionality.

This is 1984.
 
Google's (far superior) object detection has been around for years, and has had plenty of press. Perhaps people only interested in Apple-related news are shocked to hear that this is a thing, but I can all but guarantee that politicians are not just hearing about this now.
I'm aware of Google's object detection. Conflating that with what Apple is doing is disingenuous: Apple is detecting nudity in the Messages app for incoming and outgoing messages. This isn't detecting something in your Photos app and telling you it's a cat or a dog. While I concede Google's object detection could be added to their own messaging app to accomplish the same goals, the alerting feature and all the rest would be new and not a fully finished idea, which is what courts look at when deciding how much of a burden to place on a company with a decree.

This is partly how Apple defended itself against the Department of Justice in the FBI phone-unlocking case. Compelling Apple to spend employee time (which costs money and would delay other product releases) to build a function into the iPhone allowing unlimited passcode attempts was an unreasonable request under the 1789 All Writs Act, which the DoJ was attempting to use as a bludgeon to get what it wanted. Ultimately the DoJ withdrew the case against Apple without resolution.

Had Apple already created such a feature, though (for instance for the CCP in China, usable at the time only on Chinese-sold iPhones), you can bet the court would have sided with the DoJ almost immediately and ordered Apple to make that firmware accessible to the FBI on North American-sold iPhones.

That is the difference. I mean, even Apple argues against your point in its own legal battles with the US government.

Your example is not Apple folding: there are data-storage laws in China that Apple must comply with if it wants to do business there, and those in no way affect iOS. I stand by my original assertion: this doesn't open up some new capability for bad actors in any way whatsoever.
I know there are storage laws in China, and I stand by what I said earlier: they folded to Chinese oppression. They decided that the privacy of their customers had a dollar amount attached to it, and that they would rather keep selling phones there than exit the market.

They chose money over privacy, and they will continue to do that when the next battle occurs. If the DoJ had won that court case over the FBI phone, Apple would have put the backdoor in the iPhone; they would never stop selling the iPhone in America.

I want to quote something to you from Tim Cook on that FBI case:

A Dangerous Precedent

The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

I highlighted the part Tim most feared. Well, guess what? Apple just did exactly what he feared, and now it's up to governments to request this feature be extended to detect whatever they like in people's messages, just like the CCP does in China with WeChat. You literally cannot write certain words and phrases in that app; they are blocked by government decree.

Apple's stance that messages are end-to-end encrypted and thus cannot be altered by Apple just had a big brick thrown through it, one that repressive regimes can point to and say, "But you detect this in Messages, so just build a profile of things we want detected too."
 
Apple really has screwed this up. It’s not a surprise this wasn’t talked about at WWDC.
Why in the world would Apple talk about this at WWDC? That's a developer conference, for developers creating their apps. Also, since when would any company talk about controversial things it is doing at a live conference?
 
I think it is valid and worthwhile to sound the alarm about the potential for misuse of this program. It hasn't been beta tested in the wild under normal conditions. As such, I don't think it wise for those who support, or are indifferent to, on-device scanning to be so dismissive of those posting to the contrary.

Just because Apple has stated thus and thus, that doesn't necessarily mean thus and thus will happen as it should, each and every time, with every person using iCloud Photos.

If this were a well-seasoned program (read: vetted over several years), I could understand the staunch support for Apple's support docs in this area. Since that is not the case, I think everyone should be cautious and understand that many of us here have valid concerns. Disagreeing with someone doesn't make us dumb, or conspiracy theorists, etc. Some are posting exaggerations out of anger. However, look at the root of the concern and leave the insults aside. We are all in this together. It's not a matter of trying to be right with every rebuttal in the thread, at least it shouldn't be. If that is your focus, you are missing the bigger picture.
 
Whether it's a machine or a person, they are still going through all our photos. Apple is doing both.
They are using a machine to go through everyone's photos, then getting an actual person to look at our photos when a match is flagged.

WRONG. This tells me you actually did not read and/or did not understand what Apple is doing with the technology.

An actual person does not look through *ALL* of your photos.
Also, the actual person looks at matched *DERIVATIVES* of photos; the reviewer will not see the *ORIGINAL* photo.
The matches must also reach a very high threshold, or else that person cannot unlock the visual *DERIVATIVES*. The threshold is high enough that there is a 1 in a TRILLION chance per year that an error might be made. You are more likely to be struck by lightning.
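Apple's technical summary describes that unlock gate as threshold secret sharing: each match contributes a share, and the key for the visual derivatives can only be recovered once enough shares accumulate. Here is a toy sketch of the underlying primitive (Shamir's scheme over a prime field; the numbers are invented for illustration, and Apple's real protocol layers much more on top):

```python
# Toy Shamir threshold secret sharing: any t shares recover the secret,
# fewer than t reveal essentially nothing. Parameters are illustrative only.
import random

P = 2**61 - 1    # prime modulus for the toy field
random.seed(42)  # fixed seed so the example is reproducible

def split(secret, n, t):
    """Split `secret` into n shares with reconstruction threshold t."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the polynomial's constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P - 2, P) is the modular inverse of den (Fermat's little theorem)
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = split(123456, n=10, t=5)          # say the "threshold" is 5 matches
assert reconstruct(shares[:5]) == 123456   # 5 shares: key recovered
assert reconstruct(shares[:4]) != 123456   # 4 shares: reconstruction fails
```

Why this matters for the privacy claim: below the threshold the server mathematically cannot decrypt any voucher, which is stronger than a policy promise, though it says nothing about what goes into the hash database.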


The fact that an Apple employee can review our photos shows that they have a backdoor into the encryption, so their encryption is totally useless.

They've always had this capability since the introduction of iCloud. iCloud Photos was never end-to-end encrypted; nothing is changing with regard to encryption. Governments have always been asking Apple to decrypt data for them.

What if they review it and it turns out to be a naked picture of your daughter, your wife, or yourself? Or maybe it's a photo of a highly confidential business document. Totally unacceptable.

The reviewer only reviews a *DERIVATIVE*, after a *HIGH THRESHOLD* of matches against a database of known child pornography images has been reached. The *DERIVATIVE* will not be clear enough to read documents. Also, companies tell their employees not to use iCloud backups for that very reason.

You've confirmed my assumption. You did not read and/or understand the technical document Apple released.
 
I think it is valid and worthwhile to sound the alarm about the potential for misuse of this program. It hasn't been beta tested in the wild under normal conditions. As such, I don't think it wise for those who support, or are indifferent to, on-device scanning to be so dismissive of those posting to the contrary.

....

We are all in this together. It's not a matter of trying to be right with every rebuttal in the thread, at least it shouldn't be. If that is your focus, you are missing the bigger picture.

Completely agree. Great post and points AR (I pulled a bit of it to the front to highlight it - hope that's ok)
 
Wonder how all these people commenting here feel about electric cars and how they know everything about you as well (location, speed, driving inputs)? Are we going to avoid those as well?
 
Apple is evil, we know that. They kicked out a bunch of apps last year due to politics, and they kowtow to China and probably Russia. This latest scheme will morph into scheme 2, scheme 3, scheme infinity. Can somebody resurrect BeOS, for gawd's sake? Why doesn't the EFF start something up?
 
Wonder how all these people commenting here feel about electric cars and how they know everything about you as well (location, speed, driving inputs)? Are we going to avoid those as well?
My wife and I recently bought a new car (still hasn't arrived due to chip shortage but our order is in!) and this was a consideration for us. We did not like all this data being collected and it's one of the reasons (amongst others) that we chose not to purchase a Tesla.

We ended up getting a hybrid vehicle that, to our knowledge, doesn't contain this stuff, doesn't do over-the-air updates, has no cellular modem, etc.
 
Apple really has screwed this up. It’s not a surprise this wasn’t talked about at WWDC.

I bet you they sell even more phones this year despite this announcement and all the uproar it has caused too. People have been discouraged from having critical thinking skills so they can’t see that it’s not a question of IF the surveillance will be expanded but WHEN.
 
Wonder how all these people commenting here feel about electric cars and how they know everything about you as well (location, speed, driving inputs)? Are we going to avoid those as well?

Is it going to automatically report me to the police if it sees me speeding? ;)
 
Is it going to automatically report me to the police if it sees me speeding? ;)
No, it simply won't allow you to exceed the speed limit shown in the touchscreen GPS window. This tech has been around a long time for golf carts: if you leave a golf course with one of their carts, the GPS built into the controller disables the cart. It's not a far stretch to use similar tech to lower or cap the max speed of EVs.
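The golf-cart behaviour described above is essentially a geofence check in the controller firmware. A hypothetical sketch, assuming a simple circular boundary and made-up coordinates (real systems map the actual course outline):

```python
# Hypothetical geofence check: disable the cart when GPS says it has
# left an allowed circular area. Coordinates and radius are invented.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

COURSE_CENTER = (37.0, -122.0)  # hypothetical course coordinates
COURSE_RADIUS_M = 800.0         # hypothetical allowed radius

def cart_enabled(lat, lon):
    """True while the cart is inside the fence; firmware cuts power otherwise."""
    return haversine_m(lat, lon, *COURSE_CENTER) <= COURSE_RADIUS_M

assert cart_enabled(37.0, -122.0)       # at the clubhouse: runs
assert not cart_enabled(37.1, -122.0)   # roughly 11 km away: disabled
```

Swapping "cut power entirely" for "cap the motor at the posted limit" is the same comparison with a different action, which is the poster's point about EVs.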
 
I am re-posting something I wrote before

- If Apple is about to do what Google does, why not go with Google's services, which are much better anyway? Some of us chose Apple's inferior services for privacy.
This is only a decision the individual can make. If one likes the google ecosystem better that's what one should go with.
- I keep tons of medical records and photos stored on my phone (iCloud Photos is OFF). I have no idea if it's legal to have my photos scanned on my device, as they contain a lot of private medical information. I don't want medical data scanned without my permission.
The above isn't CSAM and won't match any hash in the database. As for legality, HIPAA rules should guide one there. It seems one can't stop the hashing itself; however, if one doesn't upload to iCloud, these photos should never go through the matching process.
- You recommend deactivating iCloud Photos. I don't use it anyway, but the majority of users do. Isn't the Apple ecosystem a big selling point?
With this new information about CSAM scanning, some may make an additional judgment as to whether the Apple ecosystem is for them. However, it seems one's photos will still be surveilled after moving to Google.
- You recommend ditching iPhones/iPads. Just this year I spent a lot of money on Apple devices, because I chose privacy as they advertised it and because the devices will be supported for years, so I do want to upgrade to future OS versions. I bought a new iPhone and a new iPad this year alone. I am not going to burn my money just because Apple decided to kill privacy as it was advertised for years, advertising that affected my purchase decision. I am glad if you are wealthy enough to simply ditch brand-new devices, but many of us are not.
Not everybody spent a lot of money on Apple devices this year. One may have an iPhone 6s that needs replacing, an older iPad, an older Mac, etc. If one wants to leave the Apple ecosystem over philosophical differences, now would be the time, imo. It's an individual decision, balancing philosophy against the cost to leave.

Apple didn't kill privacy. Your PII is still treated as before. Apple is attempting to stem the distribution of CSAM, which is illegal. There is a big difference between "killing privacy" and preventing distribution of CSAM.

I am not wealthy, as in the top 2%. However, if my philosophical differences outweighed the cost of abandoning the Apple platform, I would sell my equipment and find a way to do what I need to do. Otherwise the words ring hollow.
 
Wonder how all these people commenting here feel about electric cars and how they know everything about you as well (location, speed, driving inputs)? Are we going to avoid those as well?

Not going to avoid it - we are going to be vocal about things we don't like and work to enact change.

That's literally how this is supposed to work in free and democratic society (as much as it is anymore....sadly)
 
Only if you believe Apple's BS numbers.
So instead of raising the matched-image threshold (by adjusting one line of code) to reach those odds, they lower it and then lie about it, opening themselves up to a future lawsuit, just so they can capture more pedophiles, because... that'll sell more iPhones?

Sounds nonsensical.
 
Second, Apple must comply with the laws of the places where it does business. So, in a country where homosexuality is illegal, what happens when a government goes to Apple and says: we have hashes of pictures of same-sex couples displaying affection in public, and those need to be found and reported to us, as we want to arrest these individuals for illegal activities and possibly do much worse to them?

Let's say the government takes one picture of a same-sex couple hugging and Apple has to make a hash of that.

CSAM detection will only detect:
People who have that one picture

It will not detect photos of:
The same couple hugging in a different pose
The couple doing something else
Other same sex couples at all

It is a completely ineffective system for such a purpose.

Now, the Photos app, with its face recognition, is much better suited to such a purpose. So why haven't governments forced Apple to extend that app?
 
Today it is Apple scanning for child pornography and reporting you to the government. Tomorrow it's Apple scanning for and removing pirated music/books, or scanning messages/email in an attempt to thwart an attack. Apple is the new Xiaomi.

People will say you're being hyperbolic - I don't think so at all..

Don't think for a second that movie studios (just picking something random) don't want to enforce copyright down to you even freakin' talking about their content, let alone sharing clips of it in messages, etc.

Hey, maybe if I start texting a friend about Top Gun 2 in theaters, they can scan it and inject ads and showtimes from AMC, and then Apple can take an advertising cut, just because I "talked about it" with a friend in an iMessage.

That sounds "fun"
/s
That's what already happened in some countries with the ISP filters that were only supposed to stop child porn: creating a filter was an unreasonable burden for ISPs, but adding a load of extra perceptual hashes to an existing database is easy, so not doing it becomes contributory infringement.
Just switch platforms (phone, computer, tablet, watch, etc) and your privacy and security will be top notch. Check out Google and Microsoft for your next products.
MS was considering scanning all local and mounted files for pirated material, and while AIUI they backed down, I'm sure they'll sneak it back in later.

Linux or BSD seems like the only secure option.
Do people seriously not know that your gmail, yahoo mail, facebook, insta, etc. are all scanned already?
But only on their servers, and we know what is uploaded to their servers.
The problem is Apple has seriously altered the ToS. Apple does not own the phone, just the software that is licensed to you, and I can't imagine installing CSAM-scanning software to monitor your photos going over well with a judge, since Apple does not own the phone.
Unfortunately the law almost everywhere is pretty much that they can do what they like and you can put up with it or manage without. Aside from a few privacy laws, there’s almost nothing that’s considered unconscionable in a software licence

As far as I'm concerned, that IS questionable material. Why in the world would a child send a nude of themselves? Something is wrong with the parent if that's going on.

In many places the age of consent is less than 18, including many American states (IIRC the majority). I bet you went to school with a few people who had sex before 18, even if you didn't.

What amazes me is the consumer uproar against this change when people have so readily adopted smart assistants.
Apple claims they only upload data after you say the activation phrase and they show the display indication, but even so, I expect these aren't the same people.
Very well said. I always think it’s funny when people immediately cite China instead of the US, particularly after the administration we just (narrowly) escaped had been caught repeatedly breaking the law and flagrantly lying about it.
Probably because there are some people who believe that Biden will undo everything Trump did (just like Obama undid everything Bush did /s), or that now that he's been elected only good people will be elected in future, or whatever. China is just mask-off enough that you don't have to bother convincing people that it would do whatever it is you're concerned about.
The problem with your premise is that this tool cannot be configured to do other things. If Apple wanted to (or was forced to, as you suppose), they could have installed a proper backdoor that could actually surveil people, and they wouldn't even have to publicly announce it!
There’s a difference between having the capability in general and it being reasonable to require them to use it. That’s how they got out of providing unlocking firmware under the All Writs Act, and it’s why they haven’t been sued for facilitating piracy.
 
WRONG. This tells me you actually did not read and/or did not understand what Apple is doing with the technology.

An actual person does not look through *ALL* of your photos.
Also, the actual person looks at matched *DERIVATIVES* of photos; the reviewer will not see the *ORIGINAL* photo.
The matches must also reach a very high threshold, or else that person cannot unlock the visual *DERIVATIVES*. The threshold is high enough that there is a 1 in a TRILLION chance per year that an error might be made. You are more likely to be struck by lightning.




They've always had this capability since the introduction of iCloud. iCloud Photos was never end-to-end encrypted; nothing is changing with regard to encryption. Governments have always been asking Apple to decrypt data for them.



The reviewer only reviews a *DERIVATIVE*, after a *HIGH THRESHOLD* of matches against a database of known child pornography images has been reached. The *DERIVATIVE* will not be clear enough to read documents. Also, companies tell their employees not to use iCloud backups for that very reason.

You've confirmed my assumption. You did not read and/or understand the technical document Apple released.

What do you think a derivative is? It's a copy. If they look at or scan any of your photos in any form, derivative or otherwise, whether it's a correct match or not, it's an invasion of privacy. Whether it's a person or a machine, it's unacceptable.

Doing this at all is not acceptable, whether in the past, now, or in the future.

Also, Apple does not tell you what the threshold is. A "high threshold" could be 0 photos, or 1, or 2, or 10. Regardless, any threshold is unacceptable.

No one should ever be able to go through our photos. Any scanning or review of our private photos is a total invasion of privacy. It's none of their business what our photos are of, or whether they are legal or not.

Any government (or Apple, or anyone Apple chooses) can provide the entire namespace of hashes to look at every photo they want. Apple does not tell us what hashes are included; they have probably already included the whole hash namespace, which means they can view every single photo.

We understand Apple's documentation fully.

This is thought policing. 1984 in its fullest.
 