It's all very clever that it can do it, but to what end? I keep hearing of a nebulous "AI Researcher" having a use for this, but to do what exactly? Running LLMs locally has fascinating potential, but who practically needs to run the gigantic model locally?
 
It seems that in a few years individuals will be able to run full AI models on machines for the price of a car. There might be a wave of small businesses coming that offer AI as a service with high privacy and security. Also, military applications of this in small local weapons systems will be big business (for good or bad, ugh...).
 
It's all very clever that it can do it, but to what end? I keep hearing of a nebulous "AI Researcher" having a use for this, but to do what exactly? Running LLMs locally has fascinating potential, but who practically needs to run the gigantic model locally?
Probably what Apple Intelligence was hoping to do...

...but Apple needs to more or less set up a RAG system on steroids, where external systems are used normally but some commands can be executed locally. I'm sure they would love some sort of SDK for apps to provide additional context on the fly.

But God knows what level of hallucination and temperature would throw things off. Intent analysis would have to be run locally to know what to call.
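A minimal sketch of that split, with everything hypothetical (the intent names, the classifier, the cloud call): a small on-device step classifies intent first, and only requests it can't handle locally get forwarded to the big external model.

```python
# Hypothetical routing sketch: a small on-device classifier picks the intent,
# local commands execute on-device, everything else goes to the external model.

LOCAL_INTENTS = {"set_timer", "open_app", "toggle_setting"}

def classify_intent(utterance: str) -> str:
    """Stand-in for a small on-device intent model."""
    return "set_timer" if "timer" in utterance.lower() else "open_question"

def forward_to_cloud(utterance: str) -> str:
    """Stand-in for a call to a large external model, with app-provided context."""
    return f"cloud model answers: {utterance!r}"

def handle(utterance: str) -> str:
    intent = classify_intent(utterance)
    if intent in LOCAL_INTENTS:
        return f"executed {intent} locally"  # nothing leaves the device
    return forward_to_cloud(utterance)

print(handle("set a timer for 10 minutes"))
print(handle("summarize my meeting notes"))
```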
 
It's all very clever that it can do it, but to what end? I keep hearing of a nebulous "AI Researcher" having a use for this, but to do what exactly? Running LLMs locally has fascinating potential, but who practically needs to run the gigantic model locally?
Running AI models to do market research, analyze health data of individuals, patient groups, or populations, do patent searches, develop a business strategy, analyze people's reactions to communications (political or just advertising), develop clinical development strategies, and write clinical trial protocols. All stuff I would do with an AI (current LLMs are not there yet, but agentic models will be soon), and where I do NOT want anyone knowing what I do or that I do it.

Edit: have an AI audit your organization to see if there are financial, legal, or IP troubles. You want the result, but you certainly don't want anyone knowing that you ran the audit, let alone that it found issues.
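For what it's worth, the fully local version of this is already easy to wire up. A minimal sketch, assuming an Ollama server running on its default local port (the model name is illustrative); neither the question nor the answer leaves the machine:

```python
import json
import urllib.request

def ask_local_model(prompt: str, model: str = "llama3") -> str:
    """Query a locally hosted model over Ollama's HTTP API on localhost."""
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps({"model": model, "prompt": prompt, "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# e.g. an audit or patent-search question that must stay in-house
print(ask_local_model("Summarize potential IP conflicts in these claims: ..."))
```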
 
Very cool.

For everyone who wants to use the latest models without giving up their data and without buying a machine like this… go spin up a model on Amazon Bedrock. It takes no time at all, and you can pick from whatever models you want (including the latest Claude).
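If you want to try that, here's a minimal boto3 sketch using Bedrock's Converse API; the region and model ID are just examples, so substitute whatever your account has enabled:

```python
import boto3

# Requires AWS credentials configured and model access enabled in Bedrock.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # example model ID
    messages=[{"role": "user", "content": [{"text": "Hello, Claude."}]}],
    inferenceConfig={"maxTokens": 256},
)

print(response["output"]["message"]["content"][0]["text"])
```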
 
448GB of VRAM, not virtual memory (V = Video)!

If virtual memory were a usable solution to anything in this particular computing space, there would be nothing terribly special about this machine.
Yes. That should be corrected in the article. I was a bit confused when I read "virtual memory".
 
Maybe Apple could sell a dedicated LLM version of their Mac Studio to offer small and medium businesses their own localized AI server.

But it’s Apple… they’ll probably come out with one once the market is saturated by competitors.
 
It's all very clever that it can do it, but to what end? I keep hearing of a nebulous "AI Researcher" having a use for this, but to do what exactly? Running LLMs locally has fascinating potential, but who practically needs to run the gigantic model locally?

And that's why Apple is investing $500 billion in the US: to build an AI server manufacturing facility in Houston and deploy the servers in data centers across the country.
 
I wonder how well this thing will do video AI stuff, like Topaz's new Starlight or other video AI work.
 
Yes, but more like 10 than 5.
If SanDisk's high-bandwidth flash takes off, then it might be sooner rather than later. It would allow LLMs to use cheap, high-density flash memory instead of expensive main memory.
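The mechanism is roughly what llama.cpp already does with mmap: map the weight file and let the OS page it in from storage on demand, so storage bandwidth rather than RAM capacity becomes the constraint. A toy sketch (file name and shape are illustrative; real weight files run to hundreds of GB):

```python
import numpy as np

# Create a small stand-in weight file so the example is self-contained.
np.memmap("model_weights.bin", dtype=np.float16,
          mode="w+", shape=(4096, 4096)).flush()

# Map it read-only: nothing is loaded up front; the OS pages data in from
# storage only as it is touched, so flash bandwidth sets the effective speed.
weights = np.memmap("model_weights.bin", dtype=np.float16,
                    mode="r", shape=(4096, 4096))

row = np.asarray(weights[0])  # faults in only the pages backing this row
print(row.nbytes, "bytes actually read")
```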
 
What it does prove, though, is that the $500bn server centre for Skynet ChatGPT is largely a load of bollocks.
 
Maybe Apple could sell a dedicated LLM version of their Mac Studio to offer small and medium businesses their own localized AI server.

Since Apple is *already* making their own AI servers, this is a distinct possibility.

Whether or not they'll do it is another question. I think they will, but they have other issues to fix first. The optics of offering an AI server when Siri and Apple Intelligence are struggling would be poor.
 
Since Apple is *already* making their own AI servers, this is a distinct possibility.

Whether or not they'll do it is another question. I think they will, but they have other issues to fix first. The optics of offering an AI server when Siri and Apple Intelligence are struggling would be poor.
They could allow other “open source” systems to run on it and earn from the hardware sold.

Apple has proven time and time again that they aren’t that good at software offerings.
 
Would it be safe to say that in 5-10 years a smartphone will be able to run a model like this internally and without the internet?
Not without some major technology breakthrough, no. Phones are not meant for this type of massive computing problem. They can run smaller models, but I would never, ever expect a small device such as a phone to efficiently run a large AI model. It makes more sense to offload this task to servers that are able to run models like this.
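The back-of-the-envelope arithmetic makes the gap concrete. Taking an illustrative frontier-scale parameter count (671B, roughly DeepSeek R1-sized), just storing the weights dwarfs a phone's 8-16 GB of RAM even when aggressively quantized:

```python
params = 671e9  # illustrative frontier-scale parameter count

for bits in (16, 8, 4):
    gigabytes = params * bits / 8 / 1e9
    print(f"{bits}-bit weights: ~{gigabytes:,.0f} GB")

# 16-bit: ~1,342 GB; 8-bit: ~671 GB; 4-bit: ~336 GB -- vs ~8-16 GB in a phone
```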
 
Wasn't a huge portion of that for training the models? Not using the models?
Without access to copyrighted data, everything from blog posts and YouTube videos to literature and music to consumer data, they've reached the limits of what an LLM can do. They won't admit this to Wall Street; if they do, funding dries up and the lights go off, because customers are not paying for it and probably never will.

The future is in distillation of models to fit on ever more efficient hardware, so we end up not with Skynet but rather the Computer from Star Trek.
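For the curious, distillation in its classic form (Hinton et al., 2015) just trains a small "student" to match a large "teacher's" temperature-softened output distribution instead of only the hard labels. A minimal PyTorch sketch with stand-in logits:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between temperature-softened distributions,
    # scaled by T^2 as in the original formulation.
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T

student = torch.randn(8, 100, requires_grad=True)  # stand-in student logits
teacher = torch.randn(8, 100)                      # stand-in teacher logits
print(distillation_loss(student, teacher).item())
```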

If Apple do ever get the contextually aware Siri out the door running on-device, there is no future at all in the centralised model.
 
I don’t get all this AI nonsense and hoopla. It’s called critical thinking. Humans have been doing it for millions of years.
 
Nope. Apple is way out ahead here.
What are you talking about? Do they have some alien tech others don't?
AMD's new desktops with 128GB of RAM, 96 of which can go to the APU, will start at around $2000. A Mac Studio with 96GB of RAM starts at twice as much.
Nvidia will offer a similar board soon, and Intel is on it. I see no reason why they couldn't offer more RAM if that's where the market is going, unless they decide it's not the best strategy for companies that also sell dedicated pro GPUs.
 