Add me to the list. I had an M1 Air with 8GB. If I had more than a web browser (with 5-10 tabs) and one or two other apps open (Music and Discord, for example), it would beachball occasionally and noticeably slow down because memory pressure was high. I always had to keep everything to a minimum to stay under the RAM limit and avoid swap, and I had to restart the computer at least once a week.

Since then I've upgraded to an MBP M1 Max with 64GB of RAM (yeah, a bit overkill on the RAM) and it's been a relief that I can actually have stuff open now without any slowdowns.

My mom was another one who had an M1 Air with 8GB. She regularly has a ton of tabs open at a time, and she'd get it to a point where it would actually freeze from excessive page swapping. She has an M2 MBA with 16GB now, uses it exactly the same way, and hasn't had issues at all.

Same here with my M1 mini. Logic Pro ran into paging on 8GB, but 16GB put an end to that.

C'mon dudes, you're acting like having 5-10 Safari tabs open AND two RAM-hogging apps in the background, and/or running Logic Pro, is average use. "Average" users check email, maybe watch YouTube, sort photos, send a few messages... hardly ever all at the same time, and all of that would probably do fine with 8GB of M-series unified RAM.

"Average" users also don't usually hang out in Mac tech forums... none of us here would probably be satisfied with anything less than 16GB because we expect and need more performance out of our Macs.
 
Isn’t the point of the unified memory that the RAM isn’t really as important as a standalone spec?

Has the RAM been a point of contention on the first AVP that has somehow limited its capability?

The fact that it's a $3500 monitor for one person, with max 2 hours battery life, that runs iPad apps is what limited its capability. Apparently monitors don't need much RAM.
 
They should have put at least 24GB in for local LLM usage on that thing. Maybe they'll come up with a small model that lets the user navigate the interface with voice commands.
 
Given that the current macOS Tahoe runs perfectly fine on 8GB of RAM (same with iPadOS), 16GB is plenty of a buffer.


If you use local LLMs, you would understand why only including 16GB of RAM is not ideal for future-proofing, especially at that price point.
 
A few things...
I don’t think the RAM is separate on M-series chips, so it’s not like “RAM is cheap these days” is meaningful. The whole pricing situation is different to how it used to be.

You can’t really compare macOS and any of their mobile OSes in terms of memory usage. Fundamentally, macOS lets software do anything it has the rights to do, so you can run a server or a low-level process. The job of macOS (or any PC OS) is to manage resources effectively to keep every job running. That’s why you see the beach ball all the time when you’re low on RAM: it’s trying to keep that server process you have running, keep that RAM-eating monster of a JavaScript-heavy web page going, and split whatever resources are left across everything else, like it’s juggling plates.
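For what it’s worth, macOS does expose that pressure to software. Here’s a minimal Swift sketch of a long-running process listening for memory-pressure events (names and responses are illustrative, not anything from Apple’s actual scheduler):

```swift
import Dispatch

// Sketch: a background process (like the "server" example above) can observe
// system memory-pressure notifications and shed memory before heavy swapping.
let source = DispatchSource.makeMemoryPressureSource(
    eventMask: [.warning, .critical],
    queue: .main
)

source.setEventHandler {
    if source.data.contains(.critical) {
        print("Memory pressure critical: dropping caches, shrinking buffers")
    } else if source.data.contains(.warning) {
        print("Memory pressure warning: trimming non-essential memory")
    }
}

source.resume()
dispatchMain() // keep the process alive so it can receive events
```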

The mobile OS's primary job is to keep the core system running smoothly at all costs, because it was essentially made for your phone. From an emergency point of view, you don’t want one app dragging down your ability to make an emergency phone call. So apps that aren’t in view generally release their RAM, and when they come back into view they reload everything.

It’s become more relaxed over time to allow for more multitasking, but I think even downloading in the background will stop after a while if your app is not the foreground task.
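On the app side, that model looks roughly like this (a minimal UIKit sketch with a made-up cache, not visionOS internals): the system asks a foreground app to release memory, and may terminate it outright once it’s in the background, which is why things appear to “reload everything” when you return.

```swift
import UIKit

final class TabViewController: UIViewController {

    // Hypothetical cache of decoded images that can always be rebuilt later.
    private var decodedImageCache: [URL: UIImage] = [:]

    // Called when the system is under memory pressure while the app is active.
    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        decodedImageCache.removeAll() // drop anything reconstructable on demand
    }
}
```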

Anyway, all that is to say: 4 or 8GB on iPadOS is a lot different to the same memory on a MacBook in terms of performance, simply because memory is managed differently. And I imagine the Vision Pro manages memory in a similar way.

I’ve noticed that the VP has a process-quitting option that iPadOS/iOS doesn’t have, which suggests it’s not quite as strict as them about long-running background tasks and so on. But I can’t recall seeing any massive slowdown in the overall OS no matter what apps are running.

Mac is a whole different story memory-wise. Apple definitely over-egged the whole 8GB thing on the M1 MacBooks; it was a bad experience. But these modern mobile OSes are quite different. I’m surprised Apple even bothers talking about RAM on these devices; they never used to. I think it was an easy marketing win, since so many people are fixated on specs that it sells machines easily. But then they get themselves into a mess, because they can’t effectively explain why memory doesn’t matter as much; if it doesn’t, why are they highlighting it?
 
This product is far too expensive.

For privacy reasons, I would rather watch NSFW VR movies with Apple than with Meta Quest. But the Vision Pro is only available in my country as a US import with limited features and is far too expensive.
 
Good to see that the charger is still part of the in-box contents. But I would like to see the new 40W charger sold everywhere.
 
My brain read this like so ... seems about right 😂
It’s a small fraction of what’s available for the Quest 3, which has a massive number of apps now. It’s a shame considering how much more powerful the VP is, but its stratospheric price means developers won’t invest in making apps for it when it has so few owners (and therefore potential buyers of apps) compared to the Quest.
 
If you use local LLMs, you would understand why only including 16GB of RAM is not ideal for future-proofing, especially at that price point.
Nah, I have a Mac mini M4 Pro with 64GB that I load a 45GB Qwen onto, and it runs just fine as a server for my 32GB MBP, which I'll be using via Screen Mirroring ON the AVP. The point I am trying to make is: it's a DISPLAY! with some app capabilities. You don't run the local LLMs you want to do really heavy work with on it... plus Apple will have the smaller LLMs for local stuff!
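That "Mac as LLM server" setup is basically just a network call from the client machine. A rough Swift sketch, assuming something like an Ollama-style server on the mini; the hostname, port, and model tag are placeholders, not details from the post above:

```swift
import Foundation

// Request/response shapes for a non-streaming generate call.
struct GenerateRequest: Codable {
    let model: String
    let prompt: String
    let stream: Bool
}

struct GenerateResponse: Codable {
    let response: String
}

// Ask the LLM server running on another Mac on the local network.
func ask(_ prompt: String) async throws -> String {
    let url = URL(string: "http://mac-mini.local:11434/api/generate")! // placeholder host/port
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(
        GenerateRequest(model: "qwen2.5:72b", prompt: prompt, stream: false) // placeholder model tag
    )
    let (data, _) = try await URLSession.shared.data(for: request)
    return try JSONDecoder().decode(GenerateResponse.self, from: data).response
}
```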
 
They should have put at least 24GB in for local LLM usage on that thing. Maybe they'll come up with a small model that lets the user navigate the interface with voice commands.

If you use local LLMs, you would understand why only including 16GB of RAM is not ideal for future-proofing, especially at that price point.
Very true, however Apple's built-in local LLM is designed to run on devices with 8GB of RAM, with even the new Pro iPads having 12GB. If I wanted a more serious local LLM, which I do use, I'd run it on my Mac Studio, since it has way more RAM and an M4 Max, which will likely smoke the normal M5 they used in this.

You also have to remember this is a computer strapped to your face; do you really want to run hardcore local LLMs on it? Your eyes would probably melt lol.
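Back-of-the-envelope math on why the RAM ceiling matters for local models: weights alone take roughly (parameters × bits per weight ÷ 8) bytes, and on Apple Silicon that comes out of the same unified memory the OS and display pipeline are using. The quantization levels below are assumptions for illustration, not benchmarks:

```swift
// Approximate weight footprint in GB for a model with `parametersB` billion
// parameters stored at `bitsPerWeight` bits each (ignores KV cache and runtime overhead).
func approxWeightGB(parametersB: Double, bitsPerWeight: Double) -> Double {
    parametersB * bitsPerWeight / 8.0
}

let onDeviceClass = approxWeightGB(parametersB: 3,  bitsPerWeight: 4)  // ≈ 1.5 GB — fits alongside the OS
let mid7B         = approxWeightGB(parametersB: 7,  bitsPerWeight: 4)  // ≈ 3.5 GB
let big32B        = approxWeightGB(parametersB: 32, bitsPerWeight: 4)  // ≈ 16 GB — the AVP's entire RAM
print(onDeviceClass, mid7B, big32B)
```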
 