What We Know About the Apple Vision Pro Hardware So Far

Yes, it's just you, because you didn't even bother to read back on this page. See for example post #213.

It is strange that so little has been said. But as I mentioned above, FoV will need to be looked at in a new way with this headset. How new is subject to research, but I don't know of any (and I haven't looked).
True, I didn’t read the 200+ posts preceding my own, but when I said “anybody”, I should have clarified: anybody WHO WAS AT THE MEDIA EVENT AND ACTUALLY USED THE DEVICE, THEN POSTED COVERAGE, as opposed to random people speculating in comments. I just found it odd that nobody in that group said anything like “the FOV is great” or “the FOV is lacking”, or made basically any other comment alluding to the FOV quality.

Thanks for your reply!
 
Each screen renders the scene from a different perspective in order to show a stereoscopic image. That's two separate 4.5k images. From the perspective of computational power, it is indeed functionally double the framerate.
Yeah I mean, each screen shows a different image at 90Hz, not 180Hz. So the headset only has two 4.5K screens at 90Hz each.
 
That's not true. Higher refresh rates are not inextricably linked to more GPU power. That's only true if the GPU rerenders every frame.

High rendering rates have different benefits from high refresh rates, though the two are linked in multiple ways.
But in the case of the headset, the GPU is rendering at all times. Every instant it has to understand space and create a new frame on the screen that makes you feel like the floating screens are actually there. There is no pause there for the GPU.
 
But in the case of the headset, the GPU is rendering at all times. Every instant it has to understand space and create a new frame on the screen that makes you feel like the floating screens are actually there. There is no pause there for the GPU.
This is wrong in at least two ways.

- Foveated rendering makes a VERY big difference. It will likely be a while before we have a decent understanding of Apple's implementation, but it's plausible that we might see anywhere from a 50-90% (!!) reduction in frame rendering times. Due to the low latency display (90Hz, ~11ms/frame) and high accuracy of the eye tracking, Apple's likely to do substantially better than other recent implementations. Of course that only matters where the GPU is working hard.
- Don't confuse different types of GPU work. If you have to render a 3D scene, like in a video game, that's one thing. But showing the environment around you, as captured by the cameras, is another thing entirely. Most of the GPU is going to be idle for that. And 2D displays appearing virtually in your actual environment will require some simple projection math but still require very little in the way of GPU resources.
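To put rough numbers on the foveation point, here's a toy fixed-window model. The per-eye resolution, fovea-window fraction, and periphery resolution scale below are all assumptions for illustration; Apple hasn't published its actual parameters.

```python
# Toy model of foveated rendering savings: a small window at native
# resolution, everything else shaded at reduced linear resolution.
EYE_W, EYE_H = 3660, 3200   # assumed per-eye resolution (not official)
FOVEA_FRACTION = 0.15       # assumed: fovea window = 15% of panel area
PERIPHERY_SCALE = 0.25      # assumed: periphery shaded at 1/4 linear res

full = EYE_W * EYE_H
fovea = full * FOVEA_FRACTION                                 # native res
periphery = full * (1 - FOVEA_FRACTION) * PERIPHERY_SCALE**2  # downscaled
shaded = fovea + periphery
savings = 1 - shaded / full

print(f"shaded {shaded:,.0f} of {full:,} pixels ({savings:.0%} saved)")
```

With these made-up but plausible parameters you land around an 80% reduction in shaded pixels, inside the 50-90% range mentioned above.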
 
...and there's no word on just how many cameras and sensors are inside.

I believe that during the keynote they mentioned 12 cameras, 5 sensors and 6 microphones.

But no other details yet.
Not so. They showed where each of the sensors is on the hardware, both outward- and inward-facing. I think there's an illustration of this on Apple's web site.

We don't yet know camera resolution and other specs (sensor size, etc.).
 
WIRED hit the nail on the head. My emphasis, something I posted previously.


Wired is correct. Perhaps 7 years ago, or even a decade, this was a nice pivot for Cook to reassure shareholders that apple had something going as the iPhone matured. Along with the mythical apple car.

I said good luck then and still do. I mean it’s not like I don’t want this kind of tech. I’ve read enough books fantasizing about vr worlds, games, etc.

To have a chance, apple needs to make it worth my while to don this mask. Watching a movie? Really? They even brought Disney on stage. And that’s your problem right there.

What is apple doing? This is your decade-in-the-making headset and you cart your competitor on stage to showcase the ultimate movie experience, or whatever it was. What are the people working on Apple TV sitting there thinking? Where’s the vision?

As in vision tv. Vision gaming. Because quite frankly apple needs something magical here. And it ain’t Disney. I want to see games or other apps that make the others look sick. Apple has to be willing to spend the cash and take the lead.

Remember the whole software/hardware advantage? I realize apple wants another App Store and that revenue. They can still have those indie type apps.

But the only way they’ll get truly game changing experiences is to do it yourself. That’s a cash sink. Apple has the cash. Do they have the will to double down?

I’m very skeptical they do. The Facebook guy knows this. He changed the name of the company. Because the only way his risky bet has any chance is to go all in.

I’m certainly not advocating apple does the same or does anything to get away from iPhone. But to me this is a hobby like Apple TV until apple proves otherwise. An expensive hobby.

Poor AI doesn’t bode well for apple. Siri is a joke. Its major upgrade this time around? They dropped the “hey” part. Are you kidding? That’s it?
 
Good insult, using D-K, and fair, since I insulted you. But also hilarious, given the stuff you've been writing.

You know, I had more to say, but you're not arguing in good faith, unlike a number of other people I've engaged with here. So I won't bother.
You expect people to argue with you in good faith after admitting that you've insulted them? What can I say? "There is an old saying about certain things you can't fix."
 
The level of sheer stupidity and shortsightedness on display in this Wired story is breathtaking. It reminds me of the classic misfires published about the mac, ipod, iphone, ipad, etc.
You know that old saying about hindsight? You should just come out and make a firm prediction on the future of Apple's AR/VR efforts (so I can really hang you later). You obviously think of yourself as the smarter one in this little shindig of ours. Make a move.
 
Yeah I mean, each screen shows a different image at 90Hz, not 180Hz. So the headset only has two 4.5K screens at 90Hz each.
You’re getting really caught up in semantics here. Sure, it doesn’t alternate drawing frames between the screens, but it is drawing 180 unique frames per second.
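For a sense of scale, here's a quick back-of-envelope on what 180 unique frames per second means in pixels. The per-eye resolution is my assumption of roughly 3660×3200; Apple only quotes ~23 million pixels across both panels.

```python
# Back-of-envelope pixel throughput for a stereo HMD.
EYE_W, EYE_H = 3660, 3200   # assumed per-eye panel resolution (not official)
REFRESH_HZ = 90             # refreshes per second, per eye

pixels_per_eye = EYE_W * EYE_H            # one eye, one frame
pixels_per_refresh = 2 * pixels_per_eye   # both eyes, one refresh (~23M)
pixels_per_second = pixels_per_refresh * REFRESH_HZ

print(f"per eye, per frame:  {pixels_per_eye:,}")
print(f"stereo, per refresh: {pixels_per_refresh:,}")
print(f"shaded per second:   {pixels_per_second:,}")
```

Whether you call that 90fps × 2 eyes or 180fps, the compute load is the same: on these assumed numbers, roughly two billion pixels per second.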
 
Not even close. Power reduction from N3B is ~20% iso-speed, but the M2 is only a part of the overall power budget. It will help, but not that much. Most likely, 15 minutes or less.

I would love to see the M3 too, but probably the biggest improvement in power use will be improved display tech. That will likely happen in moderately small increments.
To be fair, we really don’t know if Apple used the M2 as a starting point (similar to using an Ax chip in the developer Mac mini prior to M1), and any future silicon in the AVP will be custom to its requirements, as we have seen in the past with iPhone and iPad Pro Ax chips.
 
The people's hands are free in my Wall-E example too... free to hold junk food!

There is no way a device like this is going to make people 'more' active.

Long term it's going to decimate human social skills and relationships.
If this is successful it will be even worse than the damage caused to society by smartphones.

Birth rates, creativity and mental and physical health will all plummet even more than they have already.
Kids' eyesight will be destroyed.

Watching tech journalists gushing about the realism of this is terrifying.
You know what else is realistic... reality ...and it's free.

People please get this rubbish off your head, go outside, breathe the fresh air...
...and get a damn girlfriend that isn't made from pixels.
The future needs you!
I have those products and also human interaction, it’s up to the individual to make choices for their overall health balance.

I also didn’t and don’t need an AW to motivate me to be active, but some do. The doom-and-gloom scenario you paint may affect a minority; people will be fine, especially those who can afford $3500 😝
 
I would expect my middle schooler to reason better than that. Eye muscle strain depends on focus distance, not actual screen distance. As for cataracts... well that's a different argument, but I am unaware of anything that suggests screens cause cataracts. Perhaps such evidence exists? Put up or shut up.
Focus distance and actual screen distance are one and the same in most real-life cases. You're just splitting hairs and you know it.

Cataracts are just one of many eye disorders. One thing I know for sure: your vision won't ever get better with prolonged screen use. Our lifestyle shapes our health, and the modern lifestyle doesn't always equate to good health. I don't even know what you're trying to prove here. Are you saying putting on a pair of goggles with screens inches away from your pupils will have no adverse effect on one's vision? If you're just questioning the lack of studies on the effects of AR headsets on vision, then sure. There are still some people waiting on a definitive study on the effects of human activities on the climate too.

You let your pride get the best of you by constantly misrepresenting what others say. This seems to be a pattern with you. You really come across as a third-rate engineer who has a fragile ego and who can't get enough satisfaction from his job that he has to get it by flexing technical knowledge on an open forum.

And I hate to repeat myself, but just because Apple copied something from Meta doesn't mean Meta invented said something. The fact that you're so willful in your attempt to misconstrue my statement tells me you aren't remotely interested in the truth. Despite your putting Apple on a pedestal, Apple wasn't the first player in AR/VR. Oculus had a prototype as early as 2012. Where were Apple and its ARKit in 2012?

I realize you need this more than I do so have your last word and be done with it. ;)
 
I don't disagree with what you've mentioned. However, most people are not going to be using these devices on the go or working out with them on very often. The majority of the time these will need to be plugged in due to the 2 hour max battery life. Imagine working out with this thing on all the time. It would be nasty and hot on your face. I'd wager that the vast majority of users will be using them stationary the majority of the time. With all of the gyms, diet programs, medications, etc. we have available to us, obesity is still on the rise in the US. Sad.
I am curious if the external battery pack can be daisy-chained, like Batman’s utility belt with pouches, and offer up all-day mobile computing. People have to make their own health choices; the Apple Watch encourages some to be active, while for others it's just an accessory.

We can say the same thing about desktop, laptop, phones and tablet computers. It’s one’s mindset not the devices.
 
This is wrong in at least two ways.

- Foveated rendering makes a VERY big difference. It will likely be a while before we have a decent understanding of Apple's implementation, but it's plausible that we might see anywhere from a 50-90% (!!) reduction in frame rendering times. Due to the low latency display (90Hz, ~11ms/frame) and high accuracy of the eye tracking, Apple's likely to do substantially better than other recent implementations. Of course that only matters where the GPU is working hard.
- Don't confuse different types of GPU work. If you have to render a 3D scene, like in a video game, that's one thing. But showing the environment around you, as captured by the cameras, is another thing entirely. Most of the GPU is going to be idle for that. And 2D displays appearing virtually in your actual environment will require some simple projection math but still require very little in the way of GPU resources.

In fully rendered environments, the GPU will always be working hard. The M2 is basically like an NVIDIA GTX 1060 in terms of 3D rendering capability (i.e., worse than a PS4 Pro), so they'd need a greater-than-~90% reduction in render requirements from foveated rendering to be on par with a top-line desktop GPU. That seems like wishful thinking, but I guess I will still remain hopeful. And given the insanely high resolution of the HMD panels, there are already many apps that would bring even a 4090 to its knees trying to render 3D graphics at 90fps at that resolution.

All this to say: the M2 will be a serious bottleneck on the capabilities of the HMD from day one. Perhaps that goes out the window if they can implement some sort of tethering to external computing power (though I'm thinking that is less likely as the days go by, given their emphasis on low latency).
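For what it's worth, the ">~90% reduction" arithmetic roughly checks out against published FP32 throughput numbers. Both figures below are approximate peak-TFLOPS marketing numbers, not measured game performance, so treat this as a sanity check rather than a benchmark:

```python
# How much per-frame work foveation would have to remove for an M2-class
# GPU to match a 4090-class GPU on the same scene, all else being equal.
M2_TFLOPS = 3.6        # Apple M2 10-core GPU, approximate FP32 peak
RTX4090_TFLOPS = 82.6  # NVIDIA RTX 4090, approximate FP32 peak

required_reduction = 1 - M2_TFLOPS / RTX4090_TFLOPS
print(f"required reduction: {required_reduction:.1%}")
```

On those numbers the required reduction comes out around 95%, which is why ">~90%" is the right ballpark.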
 
- Don't confuse different types of GPU work. If you have to render a 3D scene, like in a video game, that's one thing. But showing the environment around you, as captured by the cameras, is another thing entirely. Most of the GPU is going to be idle for that. And 2D displays appearing virtually in your actual environment will require some simple projection math but still require very little in the way of GPU resources.
Then why would apple miss the opportunity to announce the headset along with pro apps like FCP and Logic pro? I thought it was simply a GPU issue, fixable in next gens of the headset.
 
I am curious if the external battery pack can be daisy-chained, like Batman’s utility belt with pouches, and offer up all-day mobile computing. People have to make their own health choices; the Apple Watch encourages some to be active, while for others it's just an accessory.

We can say the same thing about desktop, laptop, phones and tablet computers. It’s one’s mindset not the devices.
I used an external battery pack with a VIVE Pro. The stock battery, while small, provided a few hours of juice and seemed to be the correct size. I bought a larger battery and it made the experience worse, not better: I was never going to wear the headset for hours on end, so the extra capacity didn't help me.
 
You know that old saying about hindsight? You should just come out and make a firm prediction on the future of Apple's AR/VR efforts (so I can really hang you later). You obviously think of yourself as the smarter one in this little shindig of ours. Make a move.
I'm not making a firm prediction because I recognize that there are all sorts of externalities that are unpredictable. But OK.

Assuming Apple actually stays the course, and the world economy doesn't tank, I predict that the Vision product line will be a massive success. Possibly in 3-5 years, possibly 7-10, but a massive success.

What I was actually going on about mostly is how astoundingly stupid it is for people to state with certainty that it's going to fail, or cause terrible changes in social interaction. So much hubris to think you know the future that well. Most such people act like they've never even heard of the concept of the accelerating tech curve. It's embarrassing.
 
To be fair, we really don’t know if Apple used the M2 as a starting point (similar to using an Ax chip in the developer Mac mini prior to M1), and any future silicon in the AVP will be custom to its requirements, as we have seen in the past with iPhone and iPad Pro Ax chips.
That's a good point, and I'm certain you're right that they will make a custom CPU/GPU/NPU/etc. chip for the Vision line, *once their volume is sufficient to justify it*. What volume would justify it? Well, clearly 40M/year would do the trick, as that justified a chip for the iPad (they sold 40M iPads in 2011 and came out with the first X-model chip, the A5X, in 2012). Who knows what the minimum is? They may well be willing to do it at 5M/year as an investment. Or even 1M. No way to know ahead of time.

In the short term, with engineering (chip design) resources constrained, the M series may be a good enough match for their needs. I doubt that we'll see such a custom chip for Vision before 2025, and wouldn't be surprised to see it take until 2027 or even later. After all, the stuff that really calls for custom work can go into the R line of chips.
 
I used an external battery pack with a VIVE Pro. The stock battery, while small, provided a few hours of juice and seemed to be the correct size. I bought a larger battery and it made the experience worse, not better: I was never going to wear the headset for hours on end, so the extra capacity didn't help me.
With the Vive Pro the battery is providing longevity for the HMD. With the Vision Pro, the battery is providing the necessary wattage/voltage to operate the entire system, including graphics operations.
 
Focus distance and actual screen distance are one and the same in most real-life cases. You're just splitting hairs and you know it.
Are you serious? Do you know anything about how these headsets work?

Focus distance is completely divorced from actual screen distance. That's sort of the whole point in a 3D headset.
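The thin-lens math makes this concrete. The numbers below are illustrative, not Vision Pro specs: a screen sitting a couple of millimetres inside the lens's focal length produces a virtual image the better part of a metre away, which is the distance the eye actually focuses at.

```python
def virtual_image_distance(focal_mm: float, screen_mm: float) -> float:
    """Distance to the virtual image of an HMD screen, in millimetres.

    Thin-lens equation, 1/f = 1/d_o + 1/d_i: with the screen (object)
    inside the focal length, the image is virtual; return its magnitude.
    """
    if screen_mm >= focal_mm:
        raise ValueError("screen must sit inside the focal length")
    return 1 / (1 / screen_mm - 1 / focal_mm)

# Illustrative numbers: a 40 mm lens with the screen 38 mm away puts the
# virtual image at 760 mm, i.e. the eye focuses ~0.76 m out, not 38 mm.
print(f"{virtual_image_distance(40.0, 38.0):.0f} mm")
```

That's the whole trick: move the screen a millimetre or two relative to the lens and the focus distance swings by tens of centimetres.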

[blah blah boring insults blah]
And I hate to repeat myself, but just because Apple copied something from Meta doesn't mean Meta invented said something. The fact that you're so willful in your attempt to misconstrue my statement tells me you aren't remotely interested in the truth. Despite your putting Apple on a pedestal, Apple wasn't the first player in AR/VR. Oculus had a prototype as early as 2012. Where were Apple and its ARKit in 2012?
You think 2012 marks the start of this? I wore a video-game 3D headset at an arcade in the South Street Seaport back around 2000, and it wasn't new then. The first consumer headset (?) was the Nintendo Virtual Boy in the mid-90s (it was crap). There were also amusement park rides around then.

I don't think Apple (or Meta) invented this concept. I was laughing at your idea that Apple did this as a reaction to Meta. What Apple has done pushes the state of the art ahead in really significant ways. It also pushes that level of tech into the reach of consumers for the first time, and at least as importantly enables that kind of refinement in third-party apps.
 
Yes, that's a very interesting question.

I do think it may be a lot less important than most people think, though. With a large monitor setup, the monitor can't move: if you look to the side, you'll see ... no display. On the other hand, with the Vision, if you look to the side *with your head moving*, the display travels with your head and you see it. If you just glance with your eyes, you won't. But people tend to move their heads when looking sharply to the side anyway, because the centers of our retinas are far better at looking closely at things than the edges, so you will normally get some of that moving-display benefit.

This strikes me as something easily susceptible to study, but I have no hard facts and numbers on hand.
It's more than that. I used the Quest 2, and its horizontal FOV is around 90°; higher-end models go to around 100°, while the human eye covers about 135°. If you've never used a VR unit before, I can tell you it can be visually claustrophobic. So if Apple is going to charge this kind of price, it had better be way above average and not visually claustrophobic, because 90° still feels like you're looking through a hole in the wall.
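Putting those (approximate) angles side by side:

```python
# Fraction of the human horizontal field of view covered by typical HMDs.
HUMAN_HFOV_DEG = 135   # approximate binocular horizontal FOV

for name, hfov_deg in [("Quest 2", 90), ("wider HMDs", 100)]:
    coverage = hfov_deg / HUMAN_HFOV_DEG
    print(f"{name}: {hfov_deg} deg -> {coverage:.0%} of human HFOV")
```

Even the wider headsets cover only about three-quarters of what your eyes can see, which is why the edges read as a black tunnel.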
 
In fully rendered environments, the GPU will always be working hard. The M2 is basically like an NVIDIA GTX 1060 in terms of 3D rendering capability (i.e., worse than a PS4 Pro), so they'd need a greater-than-~90% reduction in render requirements from foveated rendering to be on par with a top-line desktop GPU. That seems like wishful thinking, but I guess I will still remain hopeful. And given the insanely high resolution of the HMD panels, there are already many apps that would bring even a 4090 to its knees trying to render 3D graphics at 90fps at that resolution.

All this to say: the M2 will be a serious bottleneck on the capabilities of the HMD from day one. Perhaps that goes out the window if they can implement some sort of tethering to external computing power (though I'm thinking that is less likely as the days go by, given their emphasis on low latency).
You're right that it will be a bottleneck for high-end 3D gaming, and other apps that require high-end full-view 3D rendering (though perhaps surprisingly less of a problem than you'd expect due to the foveated rendering- we'll see).

My point is that this is not an issue for most use cases. Specifically, displaying your actual environment is not a big reach for the M2 - most of the rendering pipeline will be idle for that, even without foveated rendering.

Really, the GPU isn't a big issue for the majority of cases. The bigger issue is the imaging pipeline (latency in particular), but they seem to have nailed that.

Then why would apple miss the opportunity to announce the headset along with pro apps like FCP and Logic pro? I thought it was simply a GPU issue, fixable in next gens of the headset.
I thought it was right on the nose that they showed someone using a Mac version of a pro app (I forget, was it Logic?) in the Vision.

My guess is that they are working on something really spectacular to demonstrate how the Vision can enable a work style that is impossible with a 2D display. And I wouldn't expect that to be done this year. Maybe next WWDC they show us an FCP 3D. But this is hard, pioneering work. It could take longer.
 
It's more than that. I used the Quest 2, and its horizontal FOV is around 90°; higher-end models go to around 100°, while the human eye covers about 135°. If you've never used a VR unit before, I can tell you it can be visually claustrophobic. So if Apple is going to charge this kind of price, it had better be way above average and not visually claustrophobic, because 90° still feels like you're looking through a hole in the wall.
Agreed, the FoV isn't meaningless. Reports I've heard are that you can indeed see black deep to the side, but that it's not a real issue. Whatever the FoV is, it's apparently "high enough". For most people? All? We'll see soon...
 
Agreed, the FoV isn't meaningless. Reports I've heard are that you can indeed see black deep to the side, but that it's not a real issue. Whatever the FoV is, it's apparently "high enough". For most people? All? We'll see soon...
I just saw Gruber's writeup of his hands-on. He seems to be saying there was no visible border at all.
 