Totally not surprised here. Supposedly my M2 MacBook Air can only support one external monitor, but I've seen plenty of videos showing multiple monitors via a dock and DisplayLink software. Apple probably wanted to maintain certain performance standards and so limited the Vision Pro to just one display. Once the AVP gets an M3 or later chip, this limitation may go away.
The base M1 through M3 chips are hardware-limited to two displays (internal plus external, or two externals only) because the SoC lacks the display controllers to support more. Virtual screens are nothing more than windows; depending on how they're doing it, I'd guess they're using one display controller per eye. But within each display (eye), every app is just another window. I doubt there are any artificial restrictions on how many windows they can show virtually, though there might be practical limits due to available system resources.

The limitation is probably down to time to test and debug, or perhaps optimization. As a 1.0 release of visionOS, we can't expect everything to ship all at once. It'll be at least a couple of years before we see Vision Pro 2.0 hardware, if not longer, so the visionOS 2.0 release should bring a ton of new things to the current hardware.
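For anyone curious what "every app is just another window" looks like from the developer side, here's a minimal visionOS sketch: two SwiftUI window scenes, with one opening the other on demand. The scene IDs and view names are made up purely for illustration; the point is that nothing here ever touches a display controller.

```swift
import SwiftUI

// Minimal sketch of "everything is just a window" on visionOS.
// Scene IDs and view names are hypothetical, for illustration only.
@main
struct SpatialDesktopSketch: App {
    var body: some Scene {
        // The "main" surface.
        WindowGroup(id: "main") {
            LauncherView()
        }

        // Another surface. visionOS places it in the shared space alongside
        // everything else; no extra display hardware is involved.
        WindowGroup(id: "extra-surface") {
            Text("Just another window in the shared space")
        }
    }
}

struct LauncherView: View {
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        Button("Open another surface") {
            // Requesting another window is just a scene request to the system.
            openWindow(id: "extra-surface")
        }
    }
}
```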
 
The fact that the base M-series chips can’t do dual monitors, but the Vision Pro with the same chip can, will 100% anger everyone.
If Apple releases dual-screen support on the gen 1 AVP with the M2, expect to see a whole host of angry M2 MBA/iPad Pro owners.
 
But I don’t understand? I thought this product was shipped as-is, will never get improvements, and is forever frozen in a day-one state, and that’s why everyone is critiquing the hell out of it? 🤔 /sarcasm
 
The fact that the base M-series chips can’t do dual monitors, but the Vision Pro with the same chip can, will 100% anger everyone.

The M2 Mac mini can output one 6K stream and one 5K stream. Other M1/M2/M3-based Macs aren't really any different; it's just that on laptops, one of those streams is the internal display.

Similarly, the Vision Pro renders one stream for each eye.
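Rough back-of-the-envelope numbers on why those workloads are comparable. The Vision Pro figures below are assumptions (roughly 3660×3200 per eye, based on the commonly cited ~23 million total pixels, at 90 Hz), not official per-eye specs:

```swift
import Foundation

// Pixel throughput comparison. The Vision Pro per-eye resolution and refresh
// rate are assumptions for illustration, not official specs.
let sixK  = 6016.0 * 3384.0 * 60.0   // one 6K stream at 60 Hz
let fiveK = 5120.0 * 2880.0 * 60.0   // one 5K stream at 60 Hz
let macMiniPixelsPerSecond = sixK + fiveK

let assumedPerEyePixels = 3660.0 * 3200.0            // ~23 million pixels across both eyes
let visionProPixelsPerSecond = assumedPerEyePixels * 2.0 * 90.0

print(String(format: "Mac mini, 6K + 5K @ 60 Hz:  %.2f Gpx/s", macMiniPixelsPerSecond / 1e9))
print(String(format: "Vision Pro, 2 eyes @ 90 Hz: %.2f Gpx/s", visionProPixelsPerSecond / 1e9))
// Both land around 2.1 Gpx/s, i.e. the same ballpark of display work.
```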

If Apple releases dual-screen support on the gen 1 AVP with the M2, expect to see a whole host of angry M2 MBA/iPad Pro owners.

Angry about… what exactly?
 
Like Mac, iPhone, iPad, and Apple Watch, there's undoubtedly a ten-year pipeline built for the product. Give one feature, and keep an aggravating nice-to-have just out of reach for a few years... rinse, repeat. Encouraging to see people dive into the VP, but surely a few of us have been bitten by Rev A Apple products, and opt to let the product go through some revision cycles first.
 
[attached images]


Well, when we get the equivalent of this, I'll start thinking of the AVP as a serious production tool… until then? Not so much.
 
It doesn't make the slightest difference. There is zero benefit to using the Vision Pro to get any number of virtual displays. The ONLY appeal of the Vision Pro is running apps on an unlimited canvas without the boundaries of fixed displays. Using it with the Mac is the exact opposite and counterintuitive. I might as well use real displays and not have to use the headset, since there is no additional benefit. And certainly none that outweighs (pun intended) having to wear the headset for any length of time.
 
[attached images]

Well, when we get the equivalent of this, I'll start thinking of the AVP as a serious production tool… until then? Not so much.
Why? You can already have the real thing, right now. There is no reason to wear a headset and pseudo-recreate that. The only reason to wear the headset would be if you could run the apps themselves on an unlimited canvas. You can't. And won't. Not Mac apps. This is a dead end.
 
Windows/apps don't need to be constrained to a boxed display/window at all, especially with spatial computing. Let Mac apps and windows roam free like all other Vision Pro apps. Either that, or just allow resizable aspect ratios.
Yeah, this is the true approach. The user should see an “infinite desktop” with as many windows as they want, and if they’re in a spinny office chair, be able to spin around and look at different parts of their desktop.

Constraining it to a “number of screens” is no longer a requirement.
 
Wi-Fi 6 does 9.6 Gbit/s under very ideal circumstances. Transmitting a 5K image requires (2560*2*1440*2)*3*60 = 2.5 GiB/s (or ~20 Gbit/s) at 60 Hz, not to mention twice that at 120 Hz ProMotion. Oh, and that's only with 8 bits per color; for 10 bpc, you need another 33% more.
Wi-Fi 6 tops out at 600-800 Mbps real-world for most Apple devices (only 2 spatial streams and 80 MHz channels). Then Apple is just using the mirroring software stack from the iPad mirroring feature. It likely uses HEVC; for low-delay 4K mirroring it needs about 40 Mbps.

Those 40 Mbps per screen need to be rock solid to avoid glitches. That's probably why two or more displays could be challenging: you need N times the amount of flawless wireless bandwidth (40 Mbps × N).
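A quick sanity check on those figures; the Wi-Fi throughput and per-screen bitrate below are the numbers quoted in this thread, not measurements:

```swift
import Foundation

// Back-of-the-envelope check using the figures quoted above (assumed, not measured).
let rawFiveKGbps = 5120.0 * 2880.0 * 3.0 * 8.0 * 60.0 / 1e9  // uncompressed 5K, 8 bpc, 60 Hz
let realWorldWiFi6Mbps = 700.0                               // middle of the 600-800 Mbps range
let hevcPerScreenMbps = 40.0                                 // low-delay mirroring estimate

print(String(format: "Uncompressed 5K @ 60 Hz: ~%.0f Gbit/s (hence compression)", rawFiveKGbps))
for screens in 1...3 {
    let budget = Double(screens) * hevcPerScreenMbps
    let share = Int(budget / realWorldWiFi6Mbps * 100)
    print("\(screens) screen(s): \(Int(budget)) Mbit/s sustained (~\(share)% of real-world Wi-Fi 6)")
}
```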
 
Windows/apps don't need to be constrained to a boxed display/window at all, especially with spatial computing. Let Mac apps and windows roam free like all other Vision Pro apps. Either that, or just allow resizable aspect ratios.

Isn’t this what everyone wants? Separate Mac windows


That is the ideal, but I think Apple is still working out the UX. For instance, Mac apps and their tap targets (or sight targets? not sure what the new HIG calls them) are probably too small, so when you pull them out into a world of things that are interacted with through sight, they feel as though they should also be controlled through sight and not a mouse. I could perhaps see a special kind of mode for working with Mac windows, kind of like Stage Manager on the iPad in the sense that it's a different mode of using apps: a more mouse-driven spatial mode for using a bunch of different Mac windows.

So I think the UX is one big component, and figuring out the best way to optimize the hardware and software to deliver a consistent experience is the other. There are a lot of things that haven't been figured out, like essentially infinitely resizable windows. At a certain point, do you keep making the window hold more content, or start scaling it larger? If you keep scaling it to add more workspace, the relative size of the icons will get so small you can barely see them, because of the resolution of the Vision Pro and the limits of human vision. Even if you had a 16K display on a wall, if you're using desktop Safari and sitting 8 ft away on your couch, can you see all the icons and read the address bar? But if you take an app optimized for, say, a 4K maximum size and then start blowing it up to appear larger, that has a different effect on the UX. So there is probably a lot of that to figure out first, and with the project being opened up more across Apple, more people with lower "clearances" can get their eyes on it to make improvements.
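To put rough numbers on the couch scenario: the glyph height, viewing distance, and headset pixels-per-degree below are all assumptions for illustration, not Apple specs.

```swift
import Foundation

// Rough legibility estimate. All three inputs are assumptions for illustration.
let glyphHeightMeters = 0.005       // ~5 mm tall address-bar text on the wall display
let viewingDistanceMeters = 2.4     // ~8 ft from the couch
let headsetPixelsPerDegree = 34.0   // commonly cited ballpark for Vision Pro

let angularSizeDegrees = 2 * atan(glyphHeightMeters / (2 * viewingDistanceMeters)) * 180 / Double.pi
let headsetPixelsTall = angularSizeDegrees * headsetPixelsPerDegree

print(String(format: "Glyph subtends ~%.2f°, about %.0f headset pixels tall", angularSizeDegrees, headsetPixelsTall))
// ≈ 0.12° and ~4 pixels: well below a comfortable reading size, no matter how
// many pixels the virtual wall display nominally has.
```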

That's honestly probably why first gen products at Apple are improved upon so rapidly and then level off. Once the secrecy is reduced, you get all kinds of cool collaborations across the company and new ideas injected into it. What they've shipped is their MVP, or minimum viable product. And that's why I didn't buy a first gen. $3500-4000 is way too much to be a tester for something that I don't personally need at this level of development. The iPhone was instantly better than other phones. The iPad was instantly better than other tablets. Both had reasonable entry prices. The Apple Watch is the most beta Apple product I've ever used at launch, but at least the entry price was low. But this? No. You can get a really nice M3 Max 14" MBP for around that price, and that's what I did.

Once some of the initial kinks are worked out, things like latency, glare, and comfort are improved, and I can use a Mac in a more fully spatial way, I will be much more interested in purchasing one.
 
The fact that the base M-series chips can’t do dual monitors, but the Vision Pro with the same chip can, will 100% anger everyone.
If Apple releases dual-screen support on the gen 1 AVP with the M2, expect to see a whole host of angry M2 MBA/iPad Pro owners.
Except there aren’t two actual screens from a remote Mac. The only two screens are the left and right eye of the Vision Pro. Everything else is a window. If they show 3 “monitors” from a MacBook Air someday, for instance, that’s just three windows to visionOS. If people get angry about that, it’s because they don’t understand how the software works. You don’t need a separate display controller to stream a remote Mac. Besides, a MacBook Air DOES support two screens. People just forget to count the built-in display as one of them.
 
Vision will be restricted to one, Pro will let you drive two; if they can fool users into thinking they need to buy a Vision Pro 2 to get that functionality, they will. There are a bunch of things not added to the first version; it's how they make it look like they're giving you so much when you upgrade the OS or device, and what they use to try to justify two price points on hardware that should technically be able to handle the same tasks, just one does it without getting as hot as the other.
 
Apple is just using the mirroring software stack from the iPad mirroring feature.

I don't think so. They seem to be using the new macOS Screen Sharing protocol.

It likely uses HEVC; for low-delay 4K mirroring it needs about 20 Mbps.

Yes, it's possible they use lossy encoding, which… I'm not sure how to feel about that. Seems inadequate for something high-quality. (Kind of clashes with having Retina displays. Like, on the one hand, very high pixel density, but then a pixel mush?)
 
Why? You can already have the real thing, right now. There is no reason to wear a headset and pseudo-recreate that. The only reason to wear the headset would be if you could run the apps themselves on an unlimited canvas. You can't. And won't. Not Mac apps. This is a dead end.
You know @AlastorKatriona, you keep mistaking me for someone who actually thinks the AVP is a good thing.

When I look again, there you are… choosing to pick a fight with me on a hill you have chosen… *weird*
 
But only on an M1/M2/M3 Pro or Max, right? I mean, nooooooo way can a regular M chip support two external displays at the same time... impossible! Only Intel has the brains to figure that out.
 
I don't think so. They seem to be using the new macOS Screen Sharing protocol.



Yes, it's possible they use lossy encoding, which… I'm not sure how to feel about that. Seems inadequate for something high-quality. (Kind of clashes with having Retina displays. Like, on the one hand, very high pixel density, but then a pixel mush?)
High-bitrate HEVC is fairly decent for screen recording/sharing; you'll only see issues if you look for them. There's literally no way to pull this off without lossy compression at the moment, and it has to be down to the <50 Mbps range to be feasible...
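Just to make that concrete, using the same ballpark figures from earlier in the thread (assumed, not measured):

```swift
// Implied compression ratio: raw 5K @ 60 Hz versus a feasible wireless bitrate.
let rawGbps = 21.0        // uncompressed 5K, 8 bpc, 60 Hz (from the earlier calculation)
let streamMbps = 50.0     // upper end of a feasible bitrate over Wi-Fi
let ratio = Int(rawGbps * 1000 / streamMbps)
print("Roughly \(ratio):1 compression just to fit on the link")   // ≈ 420:1
```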
 
Bet you there are some engineers at Apple reading these forums on an M4 Mac ...
It really does come as a surprise that Apple engineers have access to future technology; I always thought that came from the likes of Gurman and Kuo ;) /s
 