This has been a rising trend among all companies. This is not an Apple problem; this is a “the opaque ‘national security’ mechanisms we put in place as a nation have opened up an easy vector of attack” problem.

Basically, the over-the-top secrecy and spying done behind secretive “National Security Letters” (Google it) have left companies with no immediate way to know whether these requests are real or not.
This is a little different: these are emergency requests, not national security related. With national security letters, a company can just call the NSA and ask whether it sent the request. Unless someone can hijack the NSA's phone line, that isn't a problem. What is a problem is when there are thousands of law enforcement agencies and a multitude of phone numbers and email addresses to deal with.
 
I guess with the Pedogard feature coming up they can forge hashes and scour iCloud for data to steal once it shows up flagged.
 
This has been a rising trend among all companies. This is not an Apple problem; this is a “the opaque ‘national security’ mechanisms we put in place as a nation have opened up an easy vector of attack” problem.

Basically, the over-the-top secrecy and spying done behind secretive “National Security Letters” (Google it) have left companies with no immediate way to know whether these requests are real or not.

Again, this is not an Apple problem: https://krebsonsecurity.com/2022/03...of-subpoena-via-fake-emergency-data-requests/

It's an Apple problem when they put in the Pedogard backdoor and hackers start using that feature to scour for specific target data.
 
When hackers pose as law enforcement, that's a problem for everyone (including Apple).

As for willingly turning over your private data to Facebook...yawn. We all should have gotten the memo by now.
 
I guess with the Pedogard feature coming up they can forge hashes and scour iCloud for data to steal once it shows up flagged.
Eh, you should probably think this through. You're saying someone will use your iCloud account to upload images in order to... get data from your iCloud account? If you can access the iCloud account, you can access the iCloud account.

And to be clear, there is no such thing as "forged hashes." Either the hash value is in the database or it isn't. No one but NCMEC (the National Center for Missing & Exploited Children) can add, remove, or modify values, and once something gets reported it's easily confirmed to be in the database or not, in like two microseconds.

False positives, now that is the real thing experts are worried about: an image can be sufficiently similar to be linked to a hash in the database and still not be CSAM (Child Sexual Abuse Material). Usually such images are generated intentionally to trigger these systems. But a human is supposed to check whether the image is in fact CSAM, so false positives can presumably be caught and no further action taken.

It would be a bigger issue for someone with access to your iCloud to upload actual CSAM and then trigger a law enforcement action against you, but that is a possibility today as it is, and with most cloud services as well, since they're all scanning your images for CSAM.
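Here's a rough sketch of the kind of threshold-plus-human-review matching being described. Everything below (the type name, the hard-coded hash set, the threshold value, the function name) is made up for illustration; Apple's actual system uses NeuralHash against an NCMEC-supplied database that only NCMEC controls.

```swift
import Foundation

typealias PerceptualHash = String              // stand-in for the real hash format

// Hypothetical stand-in for the NCMEC-supplied database. In the real system
// no one outside NCMEC can add, remove, or modify entries.
let knownHashes: Set<PerceptualHash> = ["a1b2c3", "d4e5f6"]

// Matches required before anything is even surfaced for human review.
let reviewThreshold = 30

// Membership check: a hash either is in the database or it isn't, so there is
// nothing to "forge". The failure mode is a benign image whose hash happens to
// collide (a false positive), which is what the human-review step is for.
func matchesNeedingReview(_ uploadedHashes: [PerceptualHash]) -> [PerceptualHash] {
    let matches = uploadedHashes.filter { knownHashes.contains($0) }
    return matches.count >= reviewThreshold ? matches : []
}
```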
 
"...data captured show a folder called "apple-health-app."

Does this mean someone will be able to see my weight and my horrible heart rate when I climb a flight of stairs!
I trust they will be howling with laughter. ;)
 
So I've never read all the Terms and Conditions etc., but could somebody please set me straight. I thought literally E-V-E-R-Y-T-H-I-N-G I stored in Apple's world was protected from being handed over to authorities. I always cite that example of Apple not handing over the keys to some suspected terrorist's iPhone in CA all those years ago.
 
This is a little different: these are emergency requests, not national security related. With national security letters, a company can just call the NSA and ask whether it sent the request. Unless someone can hijack the NSA's phone line, that isn't a problem. What is a problem is when there are thousands of law enforcement agencies and a multitude of phone numbers and email addresses to deal with.
The DHS was supposed to coordinate all of this kind of thing, but that's not how it works in practice.

It’s the opaqueness that leaves companies in the lurch. Even if this were a case of a standard NSL, companies aren’t allowed to talk about it, which I imagine means these things get walled off internally as well.
 
This has been a rising trend among all companies. This is not an Apple problem; this is a “the opaque ‘national security’ mechanisms we put in place as a nation have opened up an easy vector of attack” problem.

Basically, the over-the-top secrecy and spying done behind secretive “National Security Letters” (Google it) have left companies with no immediate way to know whether these requests are real or not.

Again, this is not an Apple problem: https://krebsonsecurity.com/2022/03...of-subpoena-via-fake-emergency-data-requests/

Most sensible post this entire thread will generate. Anyone stating "lol apple bad" or "lol apple incompetent" is just making silly claims. InfoSec is ever-changing, ever-evolving. Policies will be put in place that will change this. Improvement comes via mistakes. Life is never foolproof. Just the way of the world.
 
So I've never read all the Terms and Conditions etc., but could somebody please set me straight. I thought literally E-V-E-R-Y-T-H-I-N-G I stored in Apple's world was protected from being handed over to authorities. I always cite that example of Apple not handing over the keys to some suspected terrorist's iPhone in CA all those years ago.
The phone, no. If there is a properly executed warrant for iCloud data, Apple will cooperate by giving access to that.

That’s different from on-device security, which Apple has no ability to break.
 
Even without a backdoor, this managed to happen...
True. But in a sense, the backdoor is Apple itself, because they are the gatekeepers.

Like someone said: if Apple encrypted the data and didn't hold the keys to it, essentially locking themselves out, this issue wouldn't exist. However, that also means Apple couldn't help you recover your data.
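As a rough illustration of that trade-off, here's a minimal sketch using CryptoKit (just a sketch, not how iCloud actually stores anything): if the key only ever exists on the device, the server holds ciphertext it can't read, so there's nothing meaningful to hand over, but there's also nothing Apple could do to recover your data if you lose the key.

```swift
import CryptoKit
import Foundation

// Illustrative only. The point: whoever holds the key holds the data.
let deviceOnlyKey = SymmetricKey(size: .bits256)   // never leaves the device

func encryptForUpload(_ plaintext: Data) throws -> Data {
    // The server only ever sees this sealed blob.
    let sealed = try AES.GCM.seal(plaintext, using: deviceOnlyKey)
    return sealed.combined!                        // non-nil with the default nonce
}

func decryptAfterDownload(_ blob: Data) throws -> Data {
    // Only the key holder (the device) can open it again.
    let box = try AES.GCM.SealedBox(combined: blob)
    return try AES.GCM.open(box, using: deviceOnlyKey)
}
```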
 