Even though hashes may contain limited data, it is still a backdoor, [...] because that is where we separate child safety from surveillance, and on our own hardware it is surveillance. Even if it were kept to child safety, which is very unlikely in my opinion, it would still represent a backdoor, which is why those arguing solely about child abuse or the minutiae of its workings have the situation so wrong: it's still a backdoor waiting to be exploited by governments/hackers/despots, [...]

"A backdoor is a typically covert method of bypassing normal authentication or encryption in a computer, product, embedded device [...]" -Wikipedia

It's not a backdoor because it's not secret! You could argue it's an extra door which isn't as secure as the front door, but "backdoor" is reserved for secret methods.

[...] which is why those arguing solely about child abuse or the minutiae of its workings have the situation so wrong, as it's still a backdoor waiting to be exploited by governments/hackers/despots, [...]

When it comes to privacy, security and encryption, the minutiae are important. They can be the difference between having great privacy or not, exceptionally good security or not.

I would never form opinions or make decisions on privacy and security without knowing the details of the implementation, combined with the larger picture of the political and societal reality we are living in.
 
Could. And in the end that's what it boils down to. Could.

and...

"Apple has strongly pushed back against such a possibility, saying it will refuse any such request from governments."

Some will trust Apple's assertion. Others won't.

Those who don't should VOTE WITH THEIR WALLETS. Of those who don't trust Apple, I suspect a very tiny number will step up and follow through. That requires courage.

The whole point of proper end-to-end encryption is that you don't have to trust anybody. Apple has designed a system that is no longer cryptographically safe, since you have to trust them.
 
Exactly!
How I adore Tim Cook talking about these things not that long ago, at the end of this film!

Apple, STOP this nonsense.
Anyone looking forward to his speech at WWDC 2022?
"We've had a veeery successful year. I'm sooooo happy to be able to tell you that thanks to our efforts over the past year, over FIFTY THOUSAND dissidents and whistleblowers are now in prison.
And thanks to the corroborating evidence we were able to supply, trials on average took 81.7% less time.
Yes, THANK YOU, round of applause for our team please"
 
Where did you get that idea?

Say I am a person with problem photos and I want them on iCloud.
Okay, so I turn Backup Photos to iCloud off. CSAM stuff is inactive.
Now I manually upload these to my iCloud account.
Apple is not currently scanning iCloud and has no plans going forward unless issued a warrant / subpoena / other.

So what exactly does this change?
Reread the post I replied to please ;)
 
The whole point of proper end-to-end encryption is that you don't have to trust anybody. Apple has designed a system that is no longer cryptographically safe, since you have to trust them.

And that's why you should vote with your wallet if you don't like Apple's position.

Will you step up and commit to doing that?
 
"A backdoor is a typically covert method of bypassing normal authentication or encryption in a computer, product, embedded device [...]" -Wikipedia

It's not a backdoor because it's not secret! You could argue it's an extra door which isn't as secure as the front door, but "backdoor" is reserved for secret methods.



When it comes to privacy, security and encryption, the minutiae are important. They can be the difference between having great privacy or not, exceptionally good security or not.

I would never form opinions or make decisions on privacy and security without knowing the details of the implementation, combined with the larger picture of the political and societal reality we are living in.

I really don't care if it's a backdoor by definition (one could argue so, see: "typically"); to me it's a backdoor and spyware that somebody wants to install on a device I paid for.

My answer is simple and plain: NO
 
Wasn't intending to post again, but legal cases demonstrate they are not considered a law enforcement agency. I posted information earlier, and court transcripts from the US are available.

If they were, then it would not be admissible.

That is what allows government agencies etc. to bypass the 4th Amendment: if they were considered government agencies, the 4th Amendment would not allow their actions. It is because they are considered private entities and it is a PRIVATE SEARCH.

Good read about how tricky the government entity situation is, and how it's not completely settled. Courts have leaned towards agreeing that NCMEC is a government entity though.


> Arguments accusing NCMEC of state-actor status have fared better. The 2016 case United States v. Ackerman involved another defendant who, like the defendant in Stevenson, allegedly sent CSAM using his AOL email account, triggering AOL’s filters and thus a report to NCMEC. 831 F.3d 1292, 1294 (10th Cir. 2016). AOL reported the email, with four images attached, to NCMEC, where an analyst opened the email and viewed the attached images, confirming that all four were CSAM. Id. The Tenth Circuit agreed with the defendant that NCMEC was a “governmental entity or agent.”
 
They will be worse for almost everyone if they aren't even going to use Google services and apps.

To use most Google services and apps, including their app store, you need Google Play Services, which contains scanning software. This software regularly scans at least part of your file system.

The Google Play Store scans only for possible malware or other harmful apps. I believe it’s tied to Google Play Protect, which tries to minimise the possibility of installing harmful applications.

Anyway, Apple too registers the apps installed and has a kill switch in case actual malware gets accepted into the App Store. These are all actual measures to protect the user.

It’s worth mentioning that Apple’s client-side CSAM scanning is the first time any platform has implemented a method of scanning every user’s data in order to report the offending parties to the authorities.
 
It’s worth mentioning that Apple’s client-side CSAM scanning is the first time any platform has implemented a method of scanning every user’s data in order to report the offending parties to the authorities.
That's not true, unless you mean client-side just prior to uploading to iCloud. Most other cloud providers have been using PhotoDNA for years to scan images on the other side of the upload, reporting millions of offending photos to the authorities every year.
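To make the mechanism concrete: PhotoDNA itself is proprietary, but the general shape of server-side scanning is hashing each upload and checking it against a database of hashes of known offending images. As a rough sketch (names and the hash database are hypothetical), here is that idea with an ordinary cryptographic hash standing in for PhotoDNA's perceptual hash; note a real perceptual hash also matches resized or re-encoded copies, which SHA-256 cannot do.

```python
import hashlib

# Hypothetical database of hashes of known offending images.
# (This entry is just the SHA-256 of b"test", used for the demo below.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(data: bytes) -> str:
    """Hash the raw bytes of an uploaded file."""
    return hashlib.sha256(data).hexdigest()

def scan_upload(data: bytes) -> bool:
    """Return True if the upload matches a known hash (would trigger a report)."""
    return file_hash(data) in KNOWN_HASHES

print(scan_upload(b"test"))         # matches the entry above -> True
print(scan_upload(b"other bytes"))  # no match -> False
```

The key property being debated in this thread is *where* this check runs: on the provider's servers after upload (the PhotoDNA model) versus on the user's own device before upload (Apple's proposal).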
 
iCloud for Windows is a program, and they could do the scanning there. They could also stop uploading from the web, or only scan in the cloud if you upload from a non-Apple device.

But this system will allow Apple in the future to provide end-to-end encryption in iCloud for users who accept no uploads from Windows and the web.

With server-side scanning, there is no way it could be implemented.
Big deal; if there's a backdoor to get to that data, E2E is worthless. Not that Apple has announced E2E encryption!
 
That's not true, unless you mean client-side just prior to uploading to iCloud. Most other cloud providers have been using PhotoDNA for years to scan images on the other side of the upload, reporting millions of offending photos to the authorities every year.

Yes, that’s what I’m saying. However, Apple has most likely been doing iCloud server-side scanning since May 2019 (according to their privacy policy). Also, if you have iCloud Photos turned on and you save an image to the photo gallery in iOS, it will then be transferred straight to iCloud. Therefore, scanning happens automatically in the background and requires no active user input other than acquiring the image.
 
There is a thing called an escrow agreement.


They always had access to your photos and messages … you gave them permission to do so.
No; if you did not upload them to iCloud, they had no way to access them remotely on your iPhone. iMessages were also encrypted, and they had no way to access them unless you put them in iCloud. This is not me speculating but fact, because the US government has demanded Apple give them access to iPhones that were locked, and Apple stated they were unable to do so.

There is so much confusion on this subject of what Apple has access to and what they don’t have access to. I’ll try to explain it.

Before this, the only thing Apple had access to was whatever you uploaded to iCloud. Yes, iCloud is encrypted; however, Apple holds the encryption keys to decrypt it, so it’s basically like you’re putting stuff in their safe but they have the combination as well as you.
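The "safe with a shared combination" point can be sketched in a few lines. This is a toy model, NOT real cryptography: XOR with a random one-time key stands in for encryption, and all names here are illustrative. The difference between the two models is simply where the key lives.

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # XOR each byte with the key byte; with a random single-use key of equal
    # length this is a one-time pad, used here purely as a stand-in.
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

photo = b"my private photo"
key = secrets.token_bytes(len(photo))

# Provider-held-key model (iCloud today): ciphertext AND key both live
# server-side, so the provider can decrypt whenever it chooses.
server = {"ciphertext": encrypt(photo, key), "key": key}
provider_view = decrypt(server["ciphertext"], server["key"])  # recovers the photo

# End-to-end model: the server stores only ciphertext; the key never
# leaves the device, so the provider learns nothing from what it holds.
server_e2e = {"ciphertext": encrypt(photo, key)}
```

In the first model the user's privacy rests entirely on the provider's policy; in the second it rests on the math, which is the distinction the earlier "you don't have to trust anybody" post is making.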

With this new change, Apple now has access to what is on your iPhone. Exactly what level of access is unknown; we don’t know if they can log keystrokes or only access iMessage and photos. The only people who know for sure are at Apple, because they know what’s in the program that enables access to your iPhone. I don’t think they will ever show us what their interface looks like or what kind of access it gives them.

I think if Google or Facebook did this, everyone would just say, well, that’s to be expected; but because this is coming from Apple, the company that claimed what’s on your iPhone stays on your iPhone, it’s a bit jarring. They claimed to be privacy-focused, and now everyone is realizing that was just a bunch of BS. I’m not hating on Apple, because I have many Apple products and will probably buy more in the future, but it’s just the reality of the situation. I’m not going to put on rose-colored glasses and pretend everything is great when it’s not.
 
Yes, that’s what I’m saying. However, Apple has most likely been doing iCloud server-side scanning since May 2019 (according to their privacy policy). Also, if you have iCloud Photos turned on and you save an image to the photo gallery in iOS, it will then be transferred straight to iCloud. Therefore, scanning happens automatically in the background and requires no active user input other than acquiring the image.
Yes, everyone knows this, or at least quite a few people did, because it wasn’t a secret, and I think that was acceptable: even though it’s your data, it’s going on their server and their hardware, so they scan it for possibly illegal content. I think everyone would’ve been okay with that. If Apple announced that they were un-encrypting, scanning, then re-encrypting all my data on iCloud, I would be like, okay, whatever, it’s on their server, so I’m okay with that. I put all my stuff on iCloud and I’m okay with Apple having access to it.

What I’m not okay with, and I think most people are not okay with, is them having the capability to remotely and secretly scan data on your own device. This is the equivalent of me remotely logging onto your computer and checking out your files without you having any way of knowing what I’m doing. It’s just creepy.

Everyone is against CSAM. The problem is that it’s just the excuse Apple is using to justify this. The real reason is they are caving to governments that want access to iPhones.
 
People who ask researchers to read Apple's FAQ must be blind and not understand anything. Even with the FAQ, Apple still needs to physically verify the photos flagged by the CSAM system to ensure accuracy and verify the algorithms. So where is the privacy at that point? If the match is right, fine, but if it's wrong? An Apple worker gets to watch random photos for free.

It's not just about Apple workers; the CSAM technology was not built by Apple but comes from third-party libraries. If there is a bridge between two services, there is a hole.
 
Also, I think part of this, at least psychologically, is that people know stuff on iCloud is basically public domain. Yes, it may be secure, but it’s on the Internet, so whoever has access to it can access it. People consider what’s on their personal device personal property. When you come onto my property to check out what I have, it’s different than going somewhere else.

I think it was explained in the YouTube video that if you’re renting a storage unit, those people might have access to your storage unit to see what’s in there, and that’s fine; but what if they wanted to come to your house and inspect what you might put in there? That’s the difference, and I know it’s not exactly the same thing, but it’s the same feeling people get. One is a remote location on someone else’s hardware, while the other is with them on their personal device.

Also, in a bad situation: say I go on vacation to country X (trying to leave politics out of this, so I will just call it country X); I’ve done this multiple times. Right now I can just turn off iCloud and my phone is secure; short of my being forced to enter my PIN code, nothing on my phone can be accessed. With this change, I don’t think my iPhone would be secure. I know Apple hasn’t stated they would give governments access, but in the past they’ve complied with less-than-democratic governments for information.
 
 
People who ask researchers to read Apple's FAQ must be blind and not understand anything. Even with the FAQ, Apple still needs to physically verify the photos flagged by the CSAM system to ensure accuracy and verify the algorithms. So where is the privacy at that point? If the match is right, fine, but if it's wrong? An Apple worker gets to watch random photos for free.

It's not just about Apple workers; the CSAM technology was not built by Apple but comes from third-party libraries. If there is a bridge between two services, there is a hole.
No, what you’re stating is what Apple says they will do. Apple does not need to verify anything; Apple says they will verify, blah blah blah, but there is no need. If country X says to Apple, you will give us access to Mr. Smith’s iPhone, then Apple will do it. Apple has stated in the past that they will comply with any country’s legal requests.
 
No; if you did not upload them to iCloud, they had no way to access them remotely on your iPhone. iMessages were also encrypted, and they had no way to access them unless you put them in iCloud. This is not me speculating but fact, because the US government has demanded Apple give them access to iPhones that were locked, and Apple stated they were unable to do so.

This measure is only applicable to photos uploaded to iCloud.
 
It’s pretty quiet from the “This is Apple’s platform. They can do whatever they want. If you don’t like it, you can buy an Android” crowd 🤣
 
This measure is only applicable to photos uploaded to iCloud.
I don’t agree with this. Apple says they will not scan photos if iCloud is turned off, but if they can scan photos with iCloud turned on, then they can scan those same photos with iCloud turned off.

I think you’re confusing what Apple says they will do with what Apple is capable of doing. Those are two different things. Apple may have every intention of doing what they claim, if you believe they’re being honest. The problem is Apple will comply with the demands of any country they do business in. If they have the capability to do something, even if it’s not something they say they’re going to do, just having that capability means they will do it when a country demands they do so.

I think I mentioned in an earlier post that the FBI demanded Apple unlock certain iPhones. They didn’t, and I think still don’t, have that capability. If they had that capability, even if it were for a different purpose, say the ability to unlock your iPhone if you forgot the PIN, governments would have demanded it be used. It’s the same here: while Apple may honestly intend this for CSAM (which I don’t believe), it will be used for other things at the demand of certain countries.

Don’t confuse what they say they’re going to do with what capabilities they have. Right now, after this update, they will have the capability to remotely access photos and messages on your iPhone. It’s possible they have more access, but we don’t know. How they use that access will depend on the situation.
 
Screw them - I'm going to save approx. £1000 in annual expense and get a feature phone
Android may be better for privacy if Apple moves forward with this. I mean, they do track you, but as far as we know there is no spyware built into the code looking for illegal activity, and you can semi-lock-down Android to be less invasive. I’m considering it if I have to leave Apple. I’m online a lot, so I have to evaluate the least harm; just because Apple crossed the red line does not mean Google will follow. Watching.
 
It’s pretty quiet from the “This is Apple’s platform. They can do whatever they want. If you don’t like it, you can buy an Android” crowd 🤣
Well, Android isn’t exactly private or secure either, but I don’t know if Google can remotely access an Android phone. I know for sure that it is nowhere near as secure as the iPhone when it comes to unlocking. I think most Apple fans are not happy with this situation, but there’s not much they can do. This is what happens when you have a duopoly: your choices are Apple or Google. Right now, without any real competition, Apple can just say, whatever, we’re doing this, and if you don’t like it, we don’t care. Some hard-core Apple fans will defend Apple because they just love Apple that much, but even though I’m a big Apple fan, I can’t really defend this. It’s clearly disingenuous. If they just wanted to scan what’s being uploaded to iCloud, they could have done that without needing remote access to your iPhone. They’re doing this because of government pressure to install a back door, so that’s exactly what they did.

I understand their situation, but it’s not good for consumers, and especially not good for people living in countries where the government is a bit oppressive. This system will be used to violate people’s human rights. It’s kind of ironic that the company that claims to stand for human rights is about to start helping governments violate them.
 
Android may be better for privacy if Apple moves forward with this. I mean, they do track you, but as far as we know there is no spyware built into the code looking for illegal activity, and you can semi-lock-down Android to be less invasive. I’m considering it if I have to leave Apple. I’m online a lot, so I have to evaluate the least harm; just because Apple crossed the red line does not mean Google will follow. Watching.
I don’t think Android will be any more private. How do you know Google hasn’t done this already and just hasn’t told anyone? I think Apple was just a bit scared to do it in secret. I’m sure they thought about it but figured it would really bite them later if they did this in secret and got caught somehow. As for my personal choice of phone, it does make me feel that the iPhone isn’t the only choice anymore, since it’s not secure, but that doesn’t necessarily mean I’m going to jump to Android. It’s an alternative now, versus before it wasn’t.
 