It's weird, people have been complaining for years that Apple is late to AI, that Siri is dumb and needs AI, and that AI features on other devices, like Samsung's, are so cool compared to Apple's.

ChatGPT had the fastest-growing user base in history, and everybody seems to love generative AI for everything, including writing, photo editing, and image generation.

But now that Apple has introduced these features, all of a sudden everybody hates AI 🤔

Those who are complaining aren't customers of OpenAI. This may surprise you, but many on MacRumors run their AI models locally to avoid sending private data to big tech.
 
Why…… is the most valuable tech company in the world incapable of creating their own AI?

Why…. isn't anyone asking this question?
Their devs are still WFH; they feel it's better to free-ride on someone else's work than to waste time developing (and then bug-testing) their own. Not saying they won't try five years from now, but by then it would be useless.
 
I want a system-wide option to disable external LLMs, rather than a prompt every single time.
 
As long as it's "opt-in", "None" is one of the available selections, and there's a "stop bugging me about an external AI system" checkbox, I'll be happy.

I'll be interested in the security evaluations that test that off-device lookups do not occur when they are turned off.
 
I’m interested as well. Apple did everything but outright say “I dare you to say we’re lying” in how much they emphasized the security aspects they’ve built in.

Given how closely everything at Apple is scrutinized by the press (rightfully so), I think they mean it.
 
Why does Apple need to provide their own advanced language model? They don't need to do everything; they never made a search engine, for example.

They don’t need to develop their own if they are happy with canned software and willing to pay the extraordinary cost for a vendor to customize it. Apple Intelligence, I guess.
 
Which giant corporation do you want to give you answers? I guess it's basically just search. But if it really expands, I would love to see some open-source models become available.
 
It’s an AI party! But Apple is protecting your privacy, honest..

If they are going to rely on everyone else’s AI systems, then there is NO reason why those can’t run on older devices, as that is not a requirement of ChatGPT or Google’s system.
 
Did you even pay attention during the keynote?
 
While Cook‘s vision had him busy pouring money down the Apple Car and Vision Pro drains, AI left them in the dust. Apple is so far behind the curve they have to resort to buying canned AI.
Apple has been utilizing machine learning for years; they just never called it AI. Now they’re leveraging that growth and experience in the ML sector to debut Apple Intelligence, which is one of the nicest implementations of AI I’ve seen so far.

And everything they can do on device (for chips that support it), they are doing on device. For broader queries they’re giving us the option to connect to ChatGPT, and eventually other options as well.

This isn’t putting them behind the curve; that’s just being smart about using services that round out the feature set. I think anyone looking at this and thinking that connecting to ChatGPT’s API for a few offloaded queries is a sign of failure has no idea how software development works.
 
Apple, Microsoft, Google, Samsung, Oppo… everybody was caught out by the hype created by OpenAI with ChatGPT. They’re all catching up and throwing money at it.
 
Yes, why? Do you think your data is secure when using ChatGPT or other AI services? Apple is the company with what, over a million apps on its store, the majority of which are ‘free’ and data-mine you.
I think Apple, a $3 trillion company with a massive target on its back that is giving researchers the ability to verify its claims, is not inviting a massive and easy fraud lawsuit by making those security claims (explicitly, loudly, and repeatedly) in the keynote.

ChatGPT doesn’t get your information through Private Cloud Compute.

However, if you paid for ChatGPT (for some reason) and sign into your account, you probably are handing over your data.
 
Actually, its external-sources check only covers Apple’s servers, NOT Microsoft’s or anyone else’s. It will also ask you first if you want to use other AI services, and I bet that comes with a privacy warning.
 
Maybe I misunderstood, but Apple lays out the process starting at 5:17 in the State of the Union address:

“Still, there are some more advanced features that require larger models to reason over more complex data. So we've extended Apple Intelligence to the cloud with Private Cloud Compute to run those larger foundation models. Because these models process users' personal information, we needed to rethink Cloud Compute and extend the privacy approach of our devices to servers. Private Cloud Compute is designed specifically for processing AI, privately. It runs on a new OS using a hardened subset of the foundations of iOS, based on our industry leading operating system security work. To mitigate entire classes of privacy risks, we have omitted features that are not strictly necessary in a dedicated AI server, such as persistent data storage. On top of this secure foundation, we have completely replaced the tools normally used to manage servers. Our tooling is designed to prevent privileged access, such as via remote shell, that could allow access to user data. And finally, Private Cloud Compute includes a full machine learning stack that powers intelligence. The result is an unprecedented cloud security foundation based on Apple Silicon. It starts with the Secure Enclave to protect critical encryption keys. Secure Boot ensures the OS is signed and verified just like on iOS, Trusted Execution Monitor makes sure that only signed and verified code runs. And attestation enables a user's device to securely verify the identity and configuration of a Private Cloud Compute cluster before ever sending a request. For each request, a user's device establishes an end-to-end encrypted connection with a Private Cloud Compute cluster. Only the chosen cluster can decrypt the request data, which is not retained after the response is returned and is never accessible to Apple. 
But we're going even further: we're committing to making virtual images of every production build of Private Cloud Compute publicly available for inspection by security researchers, so they can verify the promises that we're making, and findings will be rewarded through the Apple Security Bounty. Second, we're making sure a user's device will only communicate with Private Cloud Compute clusters that are running a signed build that has been publicly logged for inspection. This is verified with the strong cryptographic attestation mechanisms in Apple silicon. We believe this is the most advanced security architecture ever deployed for cloud AI compute at scale. Apple Intelligence is the personal intelligence system that brings this all together. It includes an on-device semantic index that can organize personal information from across apps as well as an App Intents Toolbox that can understand capabilities of apps and tap into them on a user's behalf. When a user makes a request, Apple Intelligence orchestrates how it's handled either through its on-device intelligence stack or using Private Cloud Compute. And it draws on its semantic index to ground each request in the relevant personal context and uses its App Intents Toolbox to take actions for the user. It's specialized to be absolutely great at the features it enables. It's built with the best possible performance and energy efficiency, and of course, it's designed around privacy and security from the ground up. And that's Apple Intelligence”
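The attestation step that transcript describes boils down to one rule: the device will only send a request to a cluster whose signed build measurement appears in the public log, and the check happens on the device before any data leaves it. Here's a toy Python sketch of that rule; every name here (`PUBLIC_LOG`, `publish_build`, `device_send_request`) is invented for illustration, and the real mechanism relies on hardware attestation in Apple silicon and a cryptographic transparency log, not a hash lookup:

```python
import hashlib
import hmac
import os

# Stand-in for the publicly logged build measurements. In reality this is a
# cryptographic transparency log, not an in-memory set.
PUBLIC_LOG = set()

def publish_build(image_bytes: bytes) -> str:
    """Operator side: record the measurement (hash) of a signed production build."""
    measurement = hashlib.sha256(image_bytes).hexdigest()
    PUBLIC_LOG.add(measurement)
    return measurement

def device_send_request(cluster_measurement: str, request: bytes, key: bytes) -> dict:
    """Device side: refuse to talk to any cluster whose attested build
    measurement is not publicly logged; otherwise protect the payload with a
    key only the chosen cluster shares (a stand-in for the end-to-end
    encrypted connection)."""
    if cluster_measurement not in PUBLIC_LOG:
        raise PermissionError("cluster build not publicly logged; request withheld")
    tag = hmac.new(key, request, hashlib.sha256).hexdigest()
    return {"payload": request, "tag": tag}

# Usage: a logged build is accepted, an unknown one is rejected.
logged = publish_build(b"pcc-production-build")
session_key = os.urandom(32)
msg = device_send_request(logged, b"summarize my notes", session_key)
print("sent, tag prefix:", msg["tag"][:8])

try:
    device_send_request("not-in-the-log", b"summarize my notes", session_key)
except PermissionError as e:
    print("rejected:", e)
```

The point the presenters keep emphasizing is that the refusal happens client-side: a cluster running an unlogged build never sees the request at all, so there is nothing to leak even if that cluster is malicious.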
 

None of that goes in depth about how Apple’s servers will integrate with Microsoft’s or Google’s, etc.
 
Are you under the impression that Apple is not using Private Cloud Compute when it comes time to integrate with Copilot or Gemini?

Why would they go through all this work just to connect to OpenAI, and not do the exact same thing for other AI vendors?

I’m really not understanding what you’re asserting here.
 
Because they have to give information to the other AI systems for them to work, they cannot guarantee what is done with that information. What if location data is needed? An AI system can’t give accurate answers or information without knowing certain aspects.
Nowhere in that text, as I said, does it detail how Apple is integrating its servers with other systems, and since, as they advise, you will be asked first if you want to use ChatGPT etc., I would guess they cannot control what happens to your data once it is passed across.
 