Obviously, I'm not condoning the fact that Apple doesn't end-to-end encrypt iCloud backups; I really would prefer end-to-end encryption of all my stuff. It is, however, in keeping with most other non-specialist cloud providers.
Hoping for change though.
I may be confused about either the technical nuance or the corporate logic of this, but it seems reasonable to assume that the reason Apple makes a distinction with iCloud backups is that they are exactly the sort of thing a non-expert user (the vast majority) is going to desperately need and, with no way to retrieve the key on their end, not have access to. By maintaining the keys on Apple's end, it's possible for a user to retrieve their backed-up data after an account hack or a poorly-prepared-for disaster. Were the key completely user-side, they'd be out of luck.
Contrast that with e2e encryption of messaging and the other data Apple does encrypt end to end, which has no customer disadvantage: it only prevents man-in-the-middle spying. Basically, it seems like Apple is trading off: accepting less-secure encryption where there is a genuine advantage for "everyday user" functionality, and doing the best available everywhere it can without impacting those same users.
Notably, technically proficient users who understand the risks of a cloud backup that isn't encrypted with a user-held key can do a local backup instead, which is encrypted with a local key.
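The tradeoff can be sketched in a few lines. This is toy code, not real crypto: a repeated-key XOR stands in for a real cipher like AES, and the passphrases and key-derivation parameters are invented. The point is only to show why a provider-escrowed key allows recovery while a purely user-held key does not:

```python
import hashlib
import secrets

def derive_key(passphrase: str, salt: bytes) -> bytes:
    """Derive a 32-byte key from a passphrase (parameters illustrative)."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy 'cipher': XOR against a repeated key. Stand-in for AES only."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

backup = b"contacts, photos, messages"
salt = secrets.token_bytes(16)

# User-held key: only the original passphrase recovers the data.
user_key = derive_key("correct-passphrase", salt)
ciphertext = xor_bytes(backup, user_key)

# Forgot the passphrase? There is no recovery path.
wrong = xor_bytes(ciphertext, derive_key("guessed-passphrase", salt))
assert wrong != backup

# Provider escrow (the iCloud-backup model): recovery after account loss
# is possible precisely because the provider also holds a copy of the key.
escrowed_key = user_key  # stored server-side by the provider
assert xor_bytes(ciphertext, escrowed_key) == backup
```

Which is the whole tension: the escrowed copy is what rescues the everyday user, and also what makes the backup weaker than true e2e.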
Why make a big deal out of being a researcher and then do something flagrantly illegal?
But even if you don't trust them, each one has to do the calculus: other people have the same device I have. I find a bug that Apple is willing to pay $500,000 for, and I can get the payout immediately, legally, no questions asked.
Or I can try to find some very wealthy criminal or state actor who is willing to pay $2,000,000 for it, launder the money, probably quit my job because people are going to ask questions if I flaunt it, and my buyer is going to have to be okay with the risk that one of the other researchers will find the same bug tomorrow.
All of which is to say that an illegal buyer is going to have to be either extremely rich or extremely confident that you're better than the other researchers working on the same problem to be willing to pay big for it, and you're going to be under a lot more scrutiny if you suddenly get rich.
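As a back-of-envelope sketch of that calculus: the two dollar figures come from the comment above, but every probability below is an invented assumption, purely for illustration.

```python
# Expected-value comparison of the two routes a researcher can take.
# Dollar figures are from the discussion; probabilities are assumptions.

legal_payout = 500_000        # Apple's bounty, paid promptly and legally
illegal_price = 2_000_000     # hypothetical black-market offer

p_paid_legal = 1.0            # no questions asked
p_paid_illegal = 0.5          # assumed: scams, laundering losses, buyer risk
p_bug_burned = 0.3            # assumed: a rival reports the same bug first

ev_legal = legal_payout * p_paid_legal
ev_illegal = illegal_price * p_paid_illegal * (1 - p_bug_burned)

# The nominal 4x price gap shrinks dramatically once risk is priced in,
# before even counting the scrutiny of suddenly getting rich.
print(f"legal EV:   ${ev_legal:,.0f}")
print(f"illegal EV: ${ev_illegal:,.0f}")
```

Plug in gloomier assumptions about the buyer and the gap narrows further, or inverts, which is the point of the argument.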
After thinking about this more, I realized that the incentives are even more aligned toward "good" behavior than this reasoning suggests. Logic:
If I'm a bad-faith security researcher, I really don't want the people buying the hack I discovered to know who I am. If they know who I am, they can both ruin my career and put me in prison by making public that I sold them a hack, without harming themselves. Not to mention that the Russian mob/government could also have me killed if they decide I know too much.
So I'm going to pretend to be an anonymous blackhat when I sell the hole I discovered on the dark web. At which point, why would I not immediately turn around, report the bug to Apple, and collect the payout from them too? If I covered my tracks well, it'll just look like a coincidence; it insulates me from accusations that I found a bug and didn't disclose it, and I make even more money.
Basically, even a bad-faith public security researcher would presumably still report the bug to get paid twice for it, in which case the worst harm might be limited to the short window between the illegal sale and the public disclosure.