

macrumors bot
Original poster


Private Cloud Compute is a cloud intelligence system that Apple designed for private artificial intelligence processing, and it's what Apple is using to keep Apple Intelligence requests secure when they need to be processed in the cloud.


Apple promised to let security and privacy researchers verify the end-to-end security and privacy guarantees of Private Cloud Compute, and today it made its Private Cloud Compute Virtual Research Environment (VRE) and other materials publicly available to all security researchers.

Apple has a Private Cloud Compute (PCC) Security Guide that details all of the components of PCC and how they work to provide privacy for cloud-based AI processing. Apple released the source code for select components of PCC that help implement its security and privacy requirements, which allows for a deeper dive into PCC.


The Virtual Research Environment is a set of tools that lets researchers perform their own security analysis of PCC on a Mac. The VRE can be used to inspect PCC software releases, verify the consistency of the transparency log, boot a release in a virtualized environment, and modify and debug PCC software for deeper investigation. The VRE is available in the macOS Sequoia 15.1 Developer Preview and requires a Mac with an Apple silicon chip and at least 16GB of unified memory.
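"Verifying the consistency of the transparency log" boils down to Merkle-tree hashing: every published release measurement becomes a leaf, and the tree root binds all of them, so any tampering with a logged entry is detectable. Below is a minimal, illustrative sketch of that idea in Python. It is not Apple's implementation; the duplicate-last-node padding is a simplification (RFC 6962-style logs, which PCC's log resembles, split odd levels differently), and the entry strings are made up.

```python
import hashlib

def leaf_hash(data: bytes) -> bytes:
    # 0x00 domain-separation prefix for leaves (RFC 6962 convention)
    return hashlib.sha256(b"\x00" + data).digest()

def node_hash(left: bytes, right: bytes) -> bytes:
    # 0x01 prefix for interior nodes
    return hashlib.sha256(b"\x01" + left + right).digest()

def merkle_root(entries: list[bytes]) -> bytes:
    """Fold a list of log entries up to a single root hash."""
    level = [leaf_hash(e) for e in entries]
    while len(level) > 1:
        if len(level) % 2:            # simplified odd-level handling:
            level.append(level[-1])   # duplicate the last node
        level = [node_hash(level[i], level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

# An auditor who pinned the root can detect any change to a logged entry:
entries = [b"pcc-release-1", b"pcc-release-2", b"pcc-release-3"]
pinned = merkle_root(entries)
tampered = merkle_root([b"pcc-release-1", b"tampered!", b"pcc-release-3"])
```

Recomputing the root over the same entries always matches the pinned value, while a single altered entry produces a completely different root.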

Along with these tools, Apple is expanding its Apple Security Bounty to include rewards for vulnerabilities that demonstrate a compromise of the fundamental privacy and security guarantees of Private Cloud Compute. Security researchers who locate a vulnerability can earn up to $1 million.

Article Link: Apple Shares Private Cloud Compute Virtual Research Environment, Provides Bounties for Vulnerabilities
 
I'm always curious: how many of these bounties are actually collected? One time I sent an iPadOS Lock Screen bypass vulnerability to Apple and nobody ever responded. Feel like that should've been worth something. Given how badly some of these vulnerabilities would impact their public image if they were released, you'd think Apple would be interested in paying more for them.
 
Apple:

Physical or internal access: vulnerabilities where access to internal interfaces enables a compromise of the system.
This is for security researchers to discover vulnerabilities that allow outsiders access. I'm talking about what Apple does with your data. There's nothing stopping them from having code we can't see that collects data from users.

Physical access refers to researchers exploiting physical security (like physical data center access). There's no way for Apple to prove they aren't collecting any data without an external audit of their systems.
 
If you have proof that's true, let's see it. Proton, by comparison, has had outside firms inspect its servers to verify that its claim of end-to-end encryption is true. Apple has done zero outside audits of its servers.

Right. That's because none of this stuff is ready to go yet. Proton's products are ready to inspect. Apple's aren't. You don't check the doneness of a steak before it hits the grill, do you?

You don't have to hear it from me though. You're welcome to read Apple's documentation and promises at that link I provided, but I get the feeling that you're more interested in being skeptical than in being informed.
 
If you have proof that's true, let's see it. Proton, by comparison, has had outside firms inspect its servers to verify that its claim of end-to-end encryption is true. Apple has done zero outside audits of its servers.
This is much better: anyone can verify the claims themselves. An M-series virtual host with 16GB of memory is needed; a Mac Studio will do. Run it yourself.

This is quite a bold step. Now iCloud next, please.
 
From a security perspective, the only way to 100% prove transparency is having an outside firm perform and publish an audit on their internal servers.

Which will never happen
Willing to bet they will. Most vendors I work with, all the way up to Google, have pen-test docs for actual customers to access. I'd bet Apple has these as well for its enterprise clients.
 
I'm far from an expert in such things, but in my rudimentary understanding, the only way to gain and maintain anything approaching trust in such systems is for it all to be permanently open source? Of course that's not going to happen, and even if it did, I still wouldn't touch anything 'AI' with someone else's bargepole. Because, well, come with me if you want to live etcetera.
 
Right. That's because none of this stuff is ready to go yet. Proton's products are ready to inspect. Apple's aren't. You don't check the doneness of a steak before it hits the grill, do you?

You don't have to hear it from me though. You're welcome to read Apple's documentation and promises at that link I provided, but I get the feeling that you're more interested in being skeptical than in being informed.
If you ask a robber coming to rob your house if they will rob your house... will they say yes?
Apple's documentation is just that. It's not proof, and they can do whatever they want with our data behind their closed-off code. For the moment this is all based on trust, and the verification method in place is only about finding security breaches.
 
If you ask a robber coming to rob your house if they will rob your house... will they say yes?
Apple's documentation is just that. It's not proof, and they can do whatever they want with our data behind their closed-off code. For the moment this is all based on trust, and the verification method in place is only about finding security breaches.

Lest we forget, this is the same company that got rumbled spying on push notification metadata for the feds, and the excuse was 'they told us not to tell anyone'. A digression, yes, but the point being that "trust" in 2024 technology firms is the exemplar of naivety.
 
Lots of skeptics, maybe working for the competition. Look, the premise is simple: the images are signed, so Apple cannot change one bit after the audit. If they do, it will be noticed in an instant, and it would be devastating to their privacy and security stance. Now, having said that, will there be issues? Of course; this is made by humans, so expect some security issues to be discovered even after they go into production, and they will be fixed and audited. Rinse and repeat…
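The "cannot change one bit" claim rests on cryptographic hashing: the measurement published for a release binds every bit of the image, so any post-audit modification produces a different value. A toy illustration in Python (the byte strings are made up, and real PCC attestation uses signed measurements verified by the client, not a bare SHA-256 comparison):

```python
import hashlib

def measure(image: bytes) -> str:
    """Digest standing in for a release measurement; any bit flip changes it."""
    return hashlib.sha256(image).hexdigest()

# Value recorded in the transparency log at audit time (hypothetical data):
audited = measure(b"pcc-node-release-image")

unmodified_ok = measure(b"pcc-node-release-image") == audited  # re-verifies
one_byte_change = measure(b"pcc-node-release-imagf") == audited  # detected as False
```

The unmodified image re-verifies against the audited measurement, while even a one-byte change fails the comparison.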
 
From a security perspective, the only way to 100% prove transparency is having an outside firm perform and publish an audit on their internal servers.

Which will never happen

I'm talking about what Apple does with your data. There's nothing stopping them from having code we can't see that collects data from users.

If you have proof that's true, let's see it

It is never possible to prove something like this absolutely. Security is almost always relative in practice, and some trust at some level is needed, be it in hardware or software or often both. One could just argue "Well, Apple set up fake servers for people to audit, but ACTUALLY they use your data" etc. etc.; it's never gonna be absolute.
 
It is never possible to prove something like this absolutely. Security is almost always relative in practice, and some trust at some level is needed, be it in hardware or software or often both. One could just argue "Well, Apple set up fake servers for people to audit, but ACTUALLY they use your data" etc. etc.; it's never gonna be absolute.
Indeed, and the earth is flat 🤷‍♂️ (You do realize that if they did that, it's just a matter of time before it would be discovered… and billions or maybe trillions in market cap would be gone. There is absolutely zero incentive to claim this if it is not true; on the contrary. Or you can tell me what the benefit/reward would be for Apple to deceive anyone in that way. If so, they could just have said "private cloud" and been done with it.)
 
It's nice to see all the people with degrees in math and CS, and lots of real-world security experience, all agree that Apple's getting this wrong, that you can't build a verifiable setup, that you'll still have to trust them (and you shouldn't), etc.

Oh. Wait. You *don't* actually have advanced degrees in the relevant subjects? You haven't actually read all of Apple's PCC docs to understand what it is they're promising and what they're claiming is verifiable? You don't do security for a living (or maybe, you do, but really shouldn't)?

Sigh.

This stuff is *hard* and *complicated* and worthy of careful thought and a concerted effort to find weak spots. Handwaving it as fake or insufficient based on a lack of knowledge or understanding doesn't do you or anyone else any favors.

One of the biggest problems (probably the biggest) in security is the human side of things. How do you get people not to do stupid things? How do you arrange things so a not-stupid person with no special security expertise can make not-stupid choices? There's no perfect broadly applicable answer to that, unfortunately, but one way to do better is to not have tons of confusing and false information out there.

That means every single one of you spouting off about this without being an expert, fully up to date on the specifics of the PCC implementation, is doing damage to every other person reading your text. Maybe think about that before posting next time.
 
Broadly, most of the arguments here come down to "PCC is just marketing, you'll still have to trust Apple". Some slightly more sophisticated hot takes are along the lines of "if you don't have physical access to the hardware, you have nothing".

It's not that simple. If you haven't read up on it IN DETAIL you don't know enough to have a valid opinion.

FWIW, in the actual real world, Apple has a shot at doing something really big and really good here. Or they could totally screw it up and fail. My money is on them succeeding, but the whole point of what they're doing is you don't have to rely on my opinion, or any other single person's. If you don't know why that's true, then again, you don't know enough to have a valid opinion.

They might screw up. The way to know that, though, is not to carry around your skepticism and assumptions. They are giving visibility into their setup. If you want to have an opinion, you have to actually look through the window. And yes, that takes time and effort. It should.
 