As of right now, that is just gas pulled from John's butt

He's just about as likely to be correct as your prediction.

Keep in mind that Gruber is so far up Apple's rear end he needs a flashlight.

He's made an entire nearly two-decade career out of pontificating about Apple.
To say he's biased here is incredibly generous.

He's attacked apple plenty of times, recently about the App Store: https://daringfireball.net/2020/09/widgetsmith_bunco_squad
 
Nope. I can see the headlines "Apple now scans every single photo you upload to iCloud"
Yes, there would be a few headlines. But the outrage would easily subside since it's not on-device, and everyone else does it too, for legal reasons. In fact, Apple has already done that: https://www.ubergizmo.com/2020/01/apple-scan-icloud-photos-child-abuse/
On the subject of what Apple might do: John Gruber from Daring Fireball suggests this is Apple's way of enabling end to end encryption for photos and backups too by giving law enforcement only data that they can extract from a high threshold of detections. In return customers get end to end encryption. This is far better IMO than the government being able to come in and decrypt all of my iCloud data that's currently happening today.
If you are scanning the photos, it is a mockery of the word to call it E2EE. I understand that "the cloud" and the web are not secure or private. I can use them as a tradeoff for convenience. I shouldn't have to make those tradeoffs on my own personal hard drive.
No point in guessing what Apple might do. It could be good or bad. You could be wrong or John Gruber could be wrong.
I fail to see how this ultimately ends up in a good direction.
 
From that Verge article I posted above: "CSAM is illegal and abhorrent. But as the open letter to Apple notes, many countries have pushed to compromise encryption in the name of fighting terrorism, misinformation, and other objectionable content. Now that Apple has set this precedent, it will almost certainly face calls to expand it. And if Apple later rolls out end-to-end encryption for iCloud — something it’s reportedly considered doing, albeit never implemented — it’s laid out a possible roadmap for getting around E2EE’s protections."

It's that very last line that I take issue with when someone makes the argument that the way they're doing this gives us more E2EE. That's technically true. But it's penny wise and pound foolish. They're potentially giving us E2EE while providing a way (albeit difficult with many steps) to get through that E2EE if need be.

As opposed to no E2EE today...?
 
As opposed to no E2EE today...?

Our devices have E2EE on them.

The cloud does not.

We gain cloud E2EE, while also gaining a "loophole" (too strong of a word...) for the E2EE on our devices.

I don't expect near the security and privacy in the cloud that I do on my own device, so I'm not ok with this trade-off.

Edit: And if you (or anybody else) are ok with that trade-off, that's perfectly fine. But those who are not ok with it have just as valid a perspective.
 
Yes, there would be a few headlines. But the outrage would easily subside since it's not on-device, and everyone else does it too, for legal reasons. In fact, Apple has already done that: https://www.ubergizmo.com/2020/01/apple-scan-icloud-photos-child-abuse/

If you are scanning the photos, it is a mockery of the word to call it E2EE. I understand that "the cloud" and the web are not secure or private. I can use them as a tradeoff for convenience. I shouldn't have to make those tradeoffs on my own personal hard drive.

I fail to see how this ultimately ends up in a good direction.

Are you using iCloud photos today? If so, then this "evil Apple future of E2EE iCloud Photos with on device scanning" is *not* worse than today's "i have no E2EE in iCloud Photos so i'll let Apple and the government see my photos" as so far you aren't practicing what you want the iPhone to be.

If you aren't using iCloud photos today, then it's likely you're in the extreme minority in which case the iPhone will no longer be the phone for you.
 
There is no point in E2EE if it is backdoored to let people in.

Sure there is. This supposed backdoor requires a "high threshold of matches" and only exposes "visual derivatives", which is infinitely better than the zero-E2EE iCloud Photos of today, where the government can already grab willy-nilly whatever they want from me.
 
Are you using iCloud photos today? If so, then this "evil Apple future of E2EE iCloud Photos with on device scanning" is *not* worse than today's "i have no E2EE in iCloud Photos so i'll let Apple and the government see my photos" as so far you aren't practicing what you want the iPhone to be.

If you aren't using iCloud photos today, then it's likely you're in the extreme minority in which case the iPhone will no longer be the phone for you.
I do use iCloud photos currently, and I don't get angry that they are not E2EE because it is not my computer. On the other hand my iPhone is my most personal device, and belongs to me, under no circumstances should it be preemptively scanned.
 
No. Just, no. Factually wrong. Derivative is *NOT* a copy. It is *derived* from the source.



If so, you never had any privacy to begin with considering Spotlight on Mac and iPhone scans your entire device.



The threshold is enough to result in 1-in-1-trillion odds of an error happening.
So even if it were just 2 photos that cross the threshold, the accuracy is so good that it'll virtually never happen to most customers in their lifetime.
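To give a rough feel for how a match threshold drives the account-level error rate down, here's a back-of-the-envelope sketch. The per-image false-match rate, library size and threshold below are made-up illustrative numbers, not Apple's published figures:

[CODE]
from math import comb

def prob_false_flag(n_photos: int, per_image_fp: float, threshold: int) -> float:
    """Chance that at least `threshold` photos out of `n_photos` false-match
    purely by accident, assuming independent per-image errors."""
    # P(>= threshold) = 1 - P(fewer than threshold), so only a few terms to sum.
    below = sum(
        comb(n_photos, k) * per_image_fp**k * (1 - per_image_fp)**(n_photos - k)
        for k in range(threshold)
    )
    return 1 - below

# Illustrative numbers only: 20,000 photos, a 1-in-a-million per-image
# false-match rate, and a threshold of 5 matches.
print(prob_false_flag(20_000, 1e-6, 5))  # ~2.7e-11, i.e. effectively never
[/CODE]

Raise the threshold by even a couple of matches and that number collapses further, which is the point being made above.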



Spotlight already does scan for it. Photos app already scans on device facial recognition. You had no privacy to begin with. This new feature changes nothing with regards to privacy.

I'm done arguing with you. You've already made up your mind to believe your false statements so what's the point in continuing this conversation with you?
You are totally missing the point. They are going through our photos without our permission, and not for the purpose of providing us a service we want. They are doing it solely to report us to the authorities, to censor and control our private lives, and to force their view of the world on us.

Derivative is another vague marketing term to get people like you who don't know anything to feel warm and cosy. It can be whatever Apple wants it to be. A derivative can be absolutely anything. Just like all the other misleading marketing terms - hashing, secret threshold, list of hashes. This is a masterclass in doublespeak.

btw, I have a bridge for sale. It’s a bargain. You should buy it.
 
It is still a backdoor: Apple has the ability to change it and bypass the E2EE, which defeats the purpose of it.

No. This system only lets others see the derivatives of matched photos. The only way Apple can give the government full access to your original photo library would be to remove this system altogether, which is what we have today on iOS 14.
 
Just because there are hurdles to go over doesn't make the "backdoor" (though I don't consider it a true backdoor) any less of an issue.

It does when the backdoor only allows viewing of visual derivatives of matched photos, as opposed to the original photos. The only way for the government to get full access to your entire library of original photos would be to remove the system, which is what we have today on iOS 14.
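For anyone curious how a "high threshold of matches" can be enforced cryptographically rather than by policy: Apple's technical summary describes threshold secret sharing, where each matched photo contributes one share of a key and anything below the threshold reveals nothing. Here's a toy Shamir-style sketch of that idea, for illustration only, not Apple's actual protocol or parameters:

[CODE]
import random

PRIME = 2**127 - 1  # toy prime field, big enough for a demo

def split_secret(secret: int, threshold: int, n_shares: int):
    """Split `secret` so that any `threshold` shares reconstruct it,
    and any smaller set reveals essentially nothing."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n_shares + 1)]

def recover_secret(shares):
    """Lagrange interpolation at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# Pretend a "review key" is split across 100 potential safety vouchers,
# with a threshold of 30 matched photos (made-up numbers).
shares = split_secret(secret=123456789, threshold=30, n_shares=100)
assert recover_secret(shares[:30]) == 123456789  # enough matches: key recoverable
assert recover_secret(shares[:29]) != 123456789  # below threshold: just noise
[/CODE]

The point being that below the threshold there is nothing to hand over even on request; the "loophole" only opens onto the visual derivatives of matched photos once enough shares exist.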
 
Here, have this completely unrelated photo of Steve Jobs’ widow (and owner of The Atlantic) Laurene Powell-Jobs hanging out with longtime Epstein associate Ghislaine Maxwell.



Honestly **** Apple and **** the hypocritical psychopathic billionaire class.
 
No. This system only lets others see the derivatives of matched photos. The only way Apple can give full access to your original photo library would be to remove this system altogether, which is what we have today on iOS 14.
The problem is that Apple has the ability to change the software that compares photo hashes and expand its scope without the end user's knowledge. I would not be surprised if the CSAM system has a separate update mechanism the end user has no access to, which bypasses the need for iOS updates, since the photo hash database would need to be constantly updated.
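To make that concern concrete: the on-device matcher is essentially a membership test against a hash database the user can't inspect, so whoever controls the contents of that database controls what gets flagged. A minimal sketch of that shape, using an ordinary SHA-256 where the real system uses Apple's NeuralHash perceptual hash, and a plain set where the real database is blinded:

[CODE]
import hashlib

# Stand-in for the opaque, server-supplied database of flagged fingerprints.
flagged_hashes = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    # Placeholder: NeuralHash is a learned perceptual hash that survives
    # resizing and recompression; sha256 just keeps this sketch self-contained.
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(image_bytes: bytes) -> bool:
    # The device only learns "match / no match".  Nothing in this code has
    # to change for the scope to change -- only the database contents do.
    return fingerprint(image_bytes) in flagged_hashes
[/CODE]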
 
It already is, and has been since 2011, with facial recognition. It's doing this whether or not you use iCloud.
It's not using facial recognition to spy on me for the authorities, it's trying to identify photos as a service for me. If it tried to match the faces in my photos to the faces of say, missing persons in a database with the intention of reporting me to the police, then that would be a problem.
 
I meant "As opposed to no iCloud Photos E2EE today".

Right - did you see the rest of that comment?

Our devices have E2EE on them.

The cloud does not.

We gain cloud E2EE, while also gaining a "loophole" (too strong of a word...) for the E2EE on our devices.

I don't expect near the security and privacy in the cloud that I do on my own device, so I'm not ok with this trade-off.

Edit: And if you (or anybody else) are ok with that trade-off, that's perfectly fine. But those who are not ok with it have just as valid a perspective.
 
The problem is that Apple has the ability to change the software that compares photo hashes and expand its scope without the end user's knowledge. I would not be surprised if the CSAM system has a separate update mechanism the end user has no access to, which bypasses the need for iOS updates, since the photo hash database would need to be constantly updated.
Apple themselves might not have bad intentions. However, since they are relying on a black-box database, external parties like governments can force certain hashes in based on the laws they have. Also, since there are humans involved (as Apple themselves said, to do the reviews), they can be coerced. Track-record-wise, Apple had no problem outsourcing confidential Siri data to subcontractors. The potential for problems is simply too great.

The main problem is the "holier than thou" mentality. Even Apple's own privacy head has the mentality of "just don't do illegal things," ignoring that the definition of legality varies dramatically between countries. It's a very ignorant statement, and it's extremely sad that a privacy person could say something like that.
 
The problem is that Apple has the ability to change the software that compares photo hashes and expand its scope without the end user's knowledge. I would not be surprised if the CSAM system has a separate update mechanism the end user has no access to, which bypasses the need for iOS updates, since the photo hash database would need to be constantly updated.

What is sent and decrypted for Apple to see are "visual derivatives" OF the matched original images, enough to check whether the image is indeed of an underage child. They do not have access to the original image.

In order to get access to near-identical images of ALL of your photo library, Apple would have to lower the threshold to near 0, loosen the fingerprint matching until it matches all of your images, and rework the "visual derivative" algorithm to keep full resolution and all the details. Doing all of this is essentially removing the system, which is literally what we have in iOS 14. So Apple would have come full circle for absolutely no reason if that were the case. Because of that, I don't see this as a possibility.
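Put another way, that scenario amounts to turning every dial to its degenerate setting. A hypothetical sketch of those knobs, with names and values that are mine for illustration, not Apple's:

[CODE]
from dataclasses import dataclass

@dataclass
class DetectionConfig:
    match_threshold: int      # matched photos needed before anything is reviewable
    match_strictness: float   # 1.0 = only near-exact fingerprint matches
    derivative_detail: float  # 0..1, how much detail a "visual derivative" keeps

# Roughly the design as described: high threshold, strict matching, low-detail derivatives.
as_described = DetectionConfig(match_threshold=30, match_strictness=1.0, derivative_detail=0.1)

# The feared dragnet: threshold of 1, everything "matches", full-detail copies.
dragnet = DetectionConfig(match_threshold=1, match_strictness=0.0, derivative_detail=1.0)
# ...at which point the matching machinery is doing no work at all, which is
# the argument above: you'd have to gut the system to get there.
[/CODE]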
 
What is sent and decrypted for Apple to see are "visual derivatives" OF the matched original images, enough to check whether the image is indeed of an underage child. They do not have access to the original image.

In order to get access to near-identical images of ALL of your photo library, Apple would have to lower the threshold to near 0, loosen the fingerprint matching until it matches all of your images, and rework the "visual derivative" algorithm to keep full resolution and all the details. All of this is essentially removing the system, which is literally what we have in iOS 14.
Apple has the ability to change those settings through updates. In addition, people are not going to be able to stay on iOS 14 forever, but it will give them time to transition if Apple does not back down from this policy.
 