Windows/apps don't need to be constrained to a boxed display/window at all, especially with spatial computing. Let Mac apps and windows roam free like all other Vision Pro apps. Either that, or just allow resizable aspect ratios.
On a technical level, I’m not so sure how plausible that is. Remember, the Mac apps aren’t running on Vision Pro and have no conception whatsoever of what Vision Pro is. From the app’s perspective (and from SystemUIServer’s or loginwindow’s perspective), it’s just outputting to a display. It has no idea (or maybe only a minimal idea) that the display is a Vision Pro; it might as well be an HDMI connection.

In order to, say, display Finder windows in Vision Pro, WindowServer (macOS’s rough answer to an X window manager like twm, or KWin under KDE Plasma on Linux) would have to be able to pass its individual windows as objects to Vision Pro, which would then render them in visionOS’s own space. I’m not sure it could do that; it seems to have largely been designed for compositing windows onto a locally connected output device.

Apple has done somewhat similar things before; the closest analogy in the Apple world is either the Touch Bar or first-gen Apple Watch apps, where the UI executes on a separate device from the device running the backend code*. But that was strictly an opt-in thing for macOS (or iOS) applications (see the sketch at the end of this post). It’s likely that macOS support for Vision Pro would work something along those lines, and I’m not sure how many macOS-exclusive developers are out there these days (if they’ve got the same app on, say, the iPad, you might as well just make a native Vision Pro app).

X11 was designed primarily as a networked window system: windows were meant to be drawn by an X server running on what was essentially an updated glass terminal, with drawing requests and input events passed over the network between it and client applications running on other machines. In other words, X11 GUIs were intended to be displayed on separate machines from the servers hosting the applications (which is how the X server apps on Android work [or Mocha X11 on iOS, for that matter] to this day, incidentally). WindowServer was probably designed to work only with the local machine’s displays, without any sort of remote output capability.

* On Macs equipped with a Touch Bar, the Touch Bar itself is basically an embedded Apple Watch, with a separate chip (an S- or T-series part, I can’t remember the exact details) as its processor, while the application providing the functionality runs on the Mac’s main Intel x86-64 or Apple Silicon ARM processor.
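To make the opt-in point above concrete, here’s a minimal sketch of the NSTouchBar pattern (the class and identifier names are hypothetical, and this is obviously not anything Apple ships for Vision Pro): the app only describes the items it wants, and the system draws and runs that UI on the Touch Bar’s own embedded hardware.

```swift
import AppKit

// Sketch of the opt-in Touch Bar pattern: the app declares its UI items,
// and the system renders them on the Touch Bar's own processor.
final class EditorWindowController: NSWindowController, NSTouchBarDelegate {

    // Hypothetical item identifier for illustration only.
    private static let saveItemID = NSTouchBarItem.Identifier("com.example.editor.save")

    // AppKit asks the responder whether it wants to provide a Touch Bar at all.
    override func makeTouchBar() -> NSTouchBar? {
        let touchBar = NSTouchBar()
        touchBar.delegate = self
        touchBar.defaultItemIdentifiers = [Self.saveItemID]
        return touchBar
    }

    // The app supplies each item; the Touch Bar hardware displays it.
    func touchBar(_ touchBar: NSTouchBar,
                  makeItemForIdentifier identifier: NSTouchBarItem.Identifier) -> NSTouchBarItem? {
        guard identifier == Self.saveItemID else { return nil }
        let item = NSCustomTouchBarItem(identifier: identifier)
        item.view = NSButton(title: "Save", target: self, action: #selector(save(_:)))
        return item
    }

    @objc private func save(_ sender: Any?) {
        // App-side logic runs on the Mac's main CPU, not on the Touch Bar's chip.
    }
}
```

A hypothetical Vision Pro equivalent would presumably need apps to opt in the same way, scene by scene, rather than having WindowServer hand over arbitrary windows for free.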
 
What will the non-Pro version be called? Vision Air? Just 'Vision'?
Almost certainly Vision Air. "Air" already seems to be Apple’s go-to term for anything lighter and less featured than "Pro" but more powerful than the base model (mostly the MacBook Air and the iPad Air), and it fits nicely with the AirPods branding.
 
Mac, iPad and AVP all have the same M2 chip, but only the first is fairly unrestricted in functionality.

Compare how a MacBook Air can use an M2 without fans, while the Vision Pro needs air intake vents and, I think, fans to avoid thermally constraining its M2 processor. Thermal performance is obviously a limiting factor for the Vision Pro.
 
The current implementation, when connecting to an external Mac, gathers all the windows from all the displays on your Mac, moves them to the single 4K virtual monitor, and blacks out the Mac's display(s). When you end the connection, it puts the windows back where they were on your Mac before the connection (unless you moved them during the session). It does this fairly well. This is something I can work with for now, and I have spent many hours working this way, including editing code. I really enjoy it. Note, for those who haven't tried this, that using a connected Mac is not a gesture-driven interaction: you use your mouse and your Mac's keyboard (or another Bluetooth keyboard) to control anything running on your Mac.

This is a V1 OS, not just V1 hardware. Apple has a solid reputation for upgrading the capabilities of Macs and iOS devices with regular OS updates, and they seem aware that this product needs more. I'm sure they know this piece of hardware has to get a lot better and a lot more useful if they hope to sell a V2 Vision Pro, and they can and are doing that with OS updates. The latest update (v1.0.2) really improved connecting external keyboards, so things are getting better, and I'm comfortable being patient while they work on the rest.
 
The fact that the base M-series chips can’t do dual monitors, but the Vision Pro with the same chip can, will 100% anger everyone.
If Apple releases dual-screen support on the gen 1 AVP with the M2, expect to see a whole host of angry M2 MBA/iPad Pro owners.
They can do dual monitor. The MacBook screen and one external.
 
What might work just as well as putting up a second remote monitor window is to allow expansion of the single monitor. Say you have a 14” MBP you’re using with your AVP: drag the window to make it twice as big without scaling the contents, and you get a bigger canvas to work with and room for twice as many apps in the one window. That might be easier than creating a second monitor window.
This would also be very welcome, but remember that macOS only supports Split View for full-screen apps. It’s harder to stay organized on one big canvas; to me, multiple smaller ones keep things more organized.
 
Was just going to mention Immersed. If the Quest 2 or Quest 3 can run multiple displays using an app like that, there's no technical reason that the Vision Pro shouldn't be able to do the same.
How about bandwidth? The Quest does 1080p; this is 4K.
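Rough arithmetic, taking those figures at face value: 1920 × 1080 is about 2.1 million pixels, while 3840 × 2160 is about 8.3 million, so a 4K stream carries roughly four times the pixel data per frame at the same frame rate and bit depth, before compression.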
 