It's funny how the Dunning–Kruger effect works. Nine out of ten back-and-forths on MacRumors are initiated by some guy who thinks he knows the "facts" but can't even read.
Good insult, using D-K, and fair, since I insulted you. But also hilarious, given the stuff you've been writing.
Just because there is AR in ARKit doesn't mean Tim Cook already had an AR headset in the works back in 2017. You're just hung up on semantics. You also might want to check what purpose ARKit actually served in 2017 before all your word posturing.
It's been 6 years, but IIRC, the big innovations were plane estimation, scale estimation, and motion tracking. I think there was also lighting estimation and some other stuff. It was a really good start, which they iterated on steadily over the years, up to today. That's how a serious organization breaks new ground.
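For concreteness, this is roughly what that 2017 feature set looked like from the developer side; a minimal sketch against the ARKit 1.0 APIs (the function name is mine):

```swift
import ARKit

// A minimal sketch of a 2017-era ARKit session: visual-inertial world
// tracking, horizontal plane estimation, and ambient light estimation.
// (Vertical planes, people occlusion, scene meshes, etc. came later.)
func startARSession(_ session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal]   // plane estimation (horizontal-only in 1.0)
    config.isLightEstimationEnabled = true  // lighting estimation
    session.run(config)                     // motion tracking starts here
}
```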

If you think they were doing all that AR work just for a phone/pad... well, there's an old saying about certain things you can't fix.
What's funny is that for someone who assails others for a lack of historical knowledge, you actually said "There is no software like it. There is no hardware like it." as if this is the first AR headset the world has ever seen. I'm not surprised, judging from your boastful tone, but I would just like to point out the irony.
I can't help your lack of reading comprehension. It is not "as if this is the first AR headset the world has ever seen". But it's the first with this level of sophistication, refinement, and capability across many broad areas.

I just can't be bothered to give you the references, given your unhelpful attitude.

Apparently "extrapolation" isn't one of your cognitive abilities. Do you really need a study specifically on AR headsets to tell you prolonged use of a screen a few inches from your eyes hurts your vision? Like, seriously? Why do you think, as we age, we become more and more likely to get cataracts? And is there an ophthalmologist specializing in vision-correction surgery in your family that you get to talk to as you wish? I don't think so.
I would expect my middle schooler to reason better than that. Eye muscle strain depends on focus distance, not actual screen distance. As for cataracts... well, that's a different argument, but I am unaware of anything that suggests screens cause cataracts. Perhaps such evidence exists? Put up or shut up.
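To put numbers on the focus-distance point: headset optics place the panel just inside the focal length of a magnifier, so the eye accommodates to a distant virtual image rather than to the panel itself. A worked thin-lens example with illustrative numbers (not the Vision Pro's actual optics):

\[
\frac{1}{d_o} + \frac{1}{d_i} = \frac{1}{f}, \qquad
d_o = 3.8\ \text{cm},\ f = 4.0\ \text{cm}
\;\Rightarrow\;
d_i = \left(\frac{1}{4.0\ \text{cm}} - \frac{1}{3.8\ \text{cm}}\right)^{-1} \approx -76\ \text{cm}
\]

The negative \(d_i\) denotes a virtual image: the eye focuses as if the content were about 0.76 m away, even though the panel sits under 4 cm from the lens.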

(Cataracts occurring later in life are at least partially (maybe mostly or almost all?) a classic expression of "late lethal" genes, though in today's world they're no longer lethal.)

Yeah, I will say it again: Apple copies Meta when it comes to Spatial Personas. If you'd actually been following AR/VR since Facebook bought Oculus, you'd know Apple only started seriously thinking about an AR/VR headset after Mark Zuckerberg went all in on the Metaverse. The fact that you're boasting about your ignorance and actually expect anybody to take you seriously is quite amusing.
The fact that you think Meta came up with this is hilarious. Enjoy your amusement, I'm certainly enjoying mine.

You know, I had more to say, but you're not arguing in good faith, unlike a number of other people I've engaged with here. So I won't bother.
 
I'm not confusing mini and microLED. I just don't think microLED would be as pitch black as OLED. Color degradation is another story.
MicroLED and MicroOLED are 2 different things though.

Here are all the display technologies we currently have, from cheapest to most expensive:
1. organic light-emitting diodes (OLED)
2. mini light-emitting diodes (mini-LED)
3. micro light-emitting diodes (micro-LED)
4. micro organic light-emitting diodes (micro-OLED)
 
I'm guessing you're talking about the MBA and not the iPad Air. On iPads it's quite different: the screen produces a lot of heat at peak brightness, especially on the 12.9-inch Pros, and there isn't much that can be done passively to limit that. Given the high-res, high-refresh-rate displays and their proximity to warm skin, I would wager the thermal envelope of the Vision Pro is a lot more similar to the iPad's. By the looks of the side vents, the Vision Pro probably already requires active cooling just running visionOS. If you run macOS on top, it would need something extra so it wouldn't overheat.
You're right about the ipad, but your analysis is wrong. The heat from that ipad isn't going to be concentrated down into the little screens in the Vision. Most of the heat comes from the LEDs illuminating, not the logic. They're physically smaller in the Vision and so give off correspondingly less heat. (I don't know the %ages, and would be curious to hear from someone with actual data about this. Does 3% of the surface area of the ipad translate to ~3% of the heat? Or 6%? or...)

Adding macOS idling will not change things all that much. You're already paying idle costs on the M2 (which are pretty small); the incremental compute cost of running macOS is just not that high. And when it's not idling, it's doing stuff you want done anyway, so you'd be doing it natively if not in macOS. So not much change there either.

The Vision does have active cooling; it draws in air from below (and presumably expels at the top though I don't know that for certain yet).

The advantage of a laptop on a flight really comes from the fact that it's one piece and rigid. Economy tray tables are usually about the size of a 16" MBP. That's really not enough space for both a keyboard and a mouse side by side. Laptops are nice because you can let one hang off the edge a little without worry, and the trackpad is built in. As for the assistive pointer, it's very similar to what the HoloLens and Quest Pro headsets do with hand tracking. It's fine for clicking a big button once in a while, but doing something that needs a lot of precise clicking, like moving the cursor in a word processor, would be a huge PITA, since you need to aim in mid-air and hold the position while performing a click gesture.
Dunno what economy flights you're on; my 13" MBA more than fills the tray tables I've used recently. More to the point, because the screen is up, the laptop takes up a substantial *volume*, not just surface area. A kbd/trackpad, much much less volume. (This matters especially when the person in front of you reclines their seat!)

The assistive pointer won't be good enough for everything but it will suffice for some use cases. For the rest, you bring a trackpad. Different levels of (sub)optimality are inevitable.

You're right that it's debatable. It might even be better for sales in the short term. But the point is that iPad apps would be only as nice to use as Android tablet apps even today if not for Apple limiting iPhone apps when they launched the iPad. The holdouts still only on macOS are the pro apps whose developers haven't adapted them for the iPad. Apple probably wants as many of them running natively as possible, since they are positioning Vision Pro as the future of computing.
Hm. In what way did Apple limit iphone apps, such that it had such a driving impact? I feel they did way more pulling (creating opportunity) than pushing.

Pushing sales might be overridingly important in building enough of an installed base to encourage development. (I don't actually think so but it's plausible.)

For those less intensive scenarios, I'm sure it'll be totally fine. But then those scenarios are also already well served by iOS/iPadOS/visionOS. If my work didn't need any kind of dev tools or design apps, I could already do all of it on the iPad. The tasks you need a Mac for are usually either multitasking-heavy or performance-heavy, which in turn means the Vision Pro would need more performance for macOS to be useful on it.
I think the advantage of Vision over ipad is the vastly larger display area/volume, and potentially much less restricted interface. Compute power matters for some people, but not all. Maybe not even the majority.

In the end, the cross-section of people who would buy this for the built-in macOS would be limited to those who 1) need very limited computing power, yet whose needs are not fulfilled by visionOS/iOS apps, 2) have enough budget to purchase something with significantly less performance for a much larger amount of money, 3) are okay with keeping it plugged in all the time or carrying a huge power bank, 4) never need to show other people the screen they're working on or plug into anything to present, and 5) are okay with noisier cooling or a heavier/larger device. All that in exchange for not carrying an MBA alongside the headset. I'm sure these users exist, but I find it hard to believe there are enough of them to justify a whole sub-product line, especially at a higher price.
For the reasons expressed above, I think you're dramatically underestimating the Vision's capability - each of those factors covers some real-world cases, but not nearly as many as you think. Perhaps if we're lucky, we'll find out if you're right.
 
An M2 is a joke for serious VR applications, especially given what those gorgeous panels are capable of. I'm still holding out for some serious tethering capabilities; otherwise this thing is going to be underpowered by almost an order of magnitude relative to where it needs to be.
You're just handwaving. "Serious VR applications" means nothing. Each is different and will require different amounts of CPU/GPU.

That said, what we have now may not be at all what we get once they actually ship in seven (or more!) months. As I posted 180 messages ago or so, they obviously aren't going to talk about it now, but they could easily ship with an M3. By end of January, they should easily be able to give up some wafers from A17 production.

In general, look on the current hardware as a dev test unit. The final shipping product may be identical... but probably won't be.
 
[...]I am on the fence for a Series 0/Gen 1 product. If it includes the M3, the heat and power reduction would make the external battery last 3-3.5 hrs and have the device run cooler, or without any fans, leading to a thinner product.
Not even close. The power reduction from N3B is ~20% iso-speed, but the M2 is only part of the overall power budget. It will help, but not that much. Most likely, 15 minutes or less.
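The arithmetic behind that estimate, with the SoC's ~40% share of total system power being my assumption for illustration, not a measured figure:

\[
P' = P\,(1 - 0.20 \times 0.40) = 0.92\,P
\quad\Rightarrow\quad
t' = \frac{t}{0.92} \approx 1.09\,t
\]

On a 120-minute budget that's roughly 10 extra minutes; even a 50% SoC share only gets you to about 13.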

I would love to see the M3 too, but the biggest improvement in power use will probably come from improved display tech. That will likely happen in small, steady increments.
 
The people's hands are free in my Wall-E example too... free to hold junk food!

There is no way a device like this is going to make people 'more' active.

Long term it's going to decimate human social skills and relationships.
If this is successful it will be even worse than the damage caused to society by smartphones.

Birth rates, creativity and mental and physical health will all plummet even more than they have already.
Kids' eyesight will be destroyed.

Watching tech journalists gushing about the realism of this is terrifying.
You know what else is realistic? Reality... and it's free.

People please get this rubbish off your head, go outside, breathe the fresh air...
...and get a damn girlfriend that isn't made from pixels.
The future needs you!
Ah, another crystal ball owner with a perfect view of the future.

You know *nothing*. You are extrapolating from a tiny amount of data. You could be right, but there are a million ways you could be wrong. You think someone sitting in San Francisco in 1880 could have predicted what it would look like in 1980? Or now? This is like that, but much more so.
 
During the demo I could see only one Mac screen virtualized. Could you show me where you saw multiple Mac screens? And the Wi-Fi bandwidth concern is somewhat true. Keep in mind that we’re talking about a 3-4K monitor with a refresh rate of 60 Hz. That’s a lot of bandwidth, and also a lot for the headset to process.
You may be right; it's possible I mistook a native app for a second macOS screen.

My fundamental point stands though. Foveated rendering is a great way to discard data and work less hard; similar schemes are possible for virtual displays. Things like VNC/RDP/etc. deal with constrained bandwidth in a somewhat similar way.
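Back-of-envelope on the quoted bandwidth concern, assuming an uncompressed 4K stream at 24 bits per pixel:

\[
3840 \times 2160 \times 24\ \text{bit} \times 60\ \text{Hz} \approx 11.9\ \text{Gbit/s}
\]

No real-world Wi-Fi link sustains that, which is exactly why such schemes send deltas rather than full frames. Here's a minimal sketch of the dirty-tile idea VNC/RDP-style protocols use; illustrative only, not Apple's actual virtual-display protocol:

```swift
import Foundation

// Split the framebuffer into tiles; retransmit only tiles whose pixels changed.
struct TileDiffEncoder {
    let tileSize: Int
    private var lastHashes: [Int: Int] = [:]  // tile index -> hash of last-sent pixels

    init(tileSize: Int = 64) { self.tileSize = tileSize }

    /// `pixels` is a row-major RGBA framebuffer (4 bytes per pixel).
    /// Returns indices of tiles that changed since the previous call.
    mutating func dirtyTiles(pixels: [UInt8], width: Int, height: Int) -> [Int] {
        let tilesX = (width + tileSize - 1) / tileSize
        let tilesY = (height + tileSize - 1) / tileSize
        var dirty: [Int] = []
        for ty in 0..<tilesY {
            for tx in 0..<tilesX {
                var hasher = Hasher()
                let xStart = tx * tileSize
                let xEnd = min(xStart + tileSize, width)
                for y in (ty * tileSize)..<min((ty + 1) * tileSize, height) {
                    // hash this tile's slice of the current row
                    hasher.combine(pixels[(y * width + xStart) * 4 ..< (y * width + xEnd) * 4])
                }
                let index = ty * tilesX + tx
                let h = hasher.finalize()
                if lastHashes[index] != h {   // changed (or first frame): resend
                    lastHashes[index] = h
                    dirty.append(index)
                }
            }
        }
        return dirty
    }
}
```

A static desktop with a blinking cursor then costs a few tiles per frame instead of the whole screen.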
 
Jobs would be turning in his grave over the battery pack that goes in your pocket. The Quest 3 is looking pretty good for $500, and dongle-free.
 
Think about it: a higher refresh rate always needs more GPU power. Games, videos, and even a smoother desktop interface require more GPU power to run. When things get glitchy, it's because of a lack of GPU power.
That's not true. Higher refresh rates are not inextricably linked to more GPU power. That's only true if the GPU rerenders every frame.

High rendering rates have different benefits from high refresh rates, though the two are linked in multiple ways.
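Quick arithmetic with an assumed 40 ms scene cost (an illustrative number, not a measurement):

\[
t_{\text{refresh}} = \frac{1}{90\ \text{Hz}} \approx 11.1\ \text{ms}, \qquad
t_{\text{render}} = 40\ \text{ms} \;\Rightarrow\; \text{at most } 25\ \text{new frames/s}
\]

The panel can still refresh at 90 Hz by re-scanning the last completed frame; the GPU only pays for the 25 re-renders, not for all 90 refreshes.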
 
WIRED hit the nail on the head. My emphasis, something I posted previously.

The level of sheer stupidity and shortsightedness on display in this Wired story is breathtaking. It reminds me of the classic misfires published about the mac, ipod, iphone, ipad, etc.
 
What I wanna know is the FOV. This is a key spec they are intentionally withholding.
Yes, that's a very interesting question.

I do think it may be a lot less important than most people think though. With a large monitor setup, the monitor can't move. If you look to the side, you'll see ... no display. On the other hand, with the Vision, if you look to the side *with your head moving*, the display travels with your head and you see it. If you just glance with your eyes, you don't. But people tend to move their heads when looking sharply to the side anyway, because the centers of our retinas are way better for looking closely at things than the outsides, so you will normally get some level of that moving-display benefit.

This strikes me as something easily susceptible to study, but I have no hard facts and numbers on hand.
 
You're just handwaving. "Serious VR applications" means nothing. Each is different and will require different amounts of CPU/GPU.

That said, what we have now may not be at all what we get once they actually ship in seven (or more!) months. As I posted 180 messages ago or so, they obviously aren't going to talk about it now, but they could easily ship with an M3. By end of January, they should easily be able to give up some wafers from A17 production.

In general, look on the current hardware as a dev test unit. The final shipping product may be identical... but probably won't be.

For a benchmark, let’s start with the fact the m2 isn’t powerful enough to render demanding VR games (from the past few years) at the native resolution of the Vision’s spectacular panels (in terms of rasterization demands)—like not even close. You also won’t be able to use professional 3D creation tools at anything close to native resolution. I suppose they are counting on the AR component reducing the render requirements (they can render less, if most of your FOV is just pass through reality from the cameras), but fully rendered environments are going to basically be limited to 2015 era VR graphics (albeit at higher resolutions). The one caveat being maybe they are hiding some crazy foveated rendering special sauce which will reduce demands, but I think we would have heard about it by now.
 
So your point is that it is a very powerful chip, right?
I'm more concerned about spending $3500US ($5600 as an Australian, maybe more) and getting a last-gen chip. This is less about the performance and more about the 3nm chip increasing the limited 2h battery life.

I agree the M-series chips are powerful, but if 3nm starts coming to laptops that already have good battery life, it is really necessary in a headset currently limited to a 2h battery.
 
Just in case you didn’t know: you can get all-day use when it’s plugged into a wall outlet. 🔌 ⚡️
I'm not sure that's a good idea. My wife often charges her insulin pump (still attached and functioning) while she sits on the couch reading. I can't count the number of times she's forgotten she was plugged in and has gotten up and walked away from the couch... and pulled her pump site out.

With this visor, I'd be worried people will do the same thing - and will either damage the visor's charging connector or will end up on their backs on the floor.
 
I'm more concerned about spending $3500US ($5600 as an Australian, maybe more) and getting a last-gen chip. This is less about the performance and more about the 3nm chip increasing the limited 2h battery life.

I agree the M-series chips are powerful, but if 3nm starts coming to laptops that already have good battery life, it is really necessary in a headset currently limited to a 2h battery.
I hear you, but imho the headset will not launch until Apr/May next year and a lot can happen between now and then…
 
What about reading glasses or progressive lenses? Too bad a built-in hardware or software solution couldn't work instead of drop-in lenses.
 
For a benchmark, let’s start with the fact the m2 isn’t powerful enough to render demanding VR games (from the past few years) at the native resolution of the Vision’s spectacular panels (in terms of rasterization demands)—like not even close. You also won’t be able to use professional 3D creation tools at anything close to native resolution. I suppose they are counting on the AR component reducing the render requirements (they can render less, if most of your FOV is just pass through reality from the cameras), but fully rendered environments are going to basically be limited to 2015 era VR graphics (albeit at higher resolutions). The one caveat being maybe they are hiding some crazy foveated rendering special sauce which will reduce demands, but I think we would have heard about it by now.
In fact that is exactly what they're doing, and they talked about it in the keynote.

Beyond that, you're right that (at least without that) it's inadequate for 3D games at current state-of-the-art levels of scene and object complexity, but for a great many applications, you don't need 60fps or even 30fps. Double-buffer with rendering into a backing buffer at (say) 3-5fps, then render the buffer at 90fps, and you may have something that's totally adequate in terms of interacting with 3D tools, and frickin amazing in terms of utility compared to other options. If you really need fast frame rates at state-of-the-art levels of detail, then, yes, you may or may not be stuck depending on how good the foveated rendering is.
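A toy simulation of that double-buffer scheme, with made-up rates, and obviously nothing like Apple's actual compositor:

```swift
import Foundation

// Content re-renders at 4 fps into a backing buffer; the compositor
// re-presents whatever the buffer holds at every 90 Hz vsync.
let refreshHz = 90.0
let renderHz = 4.0
var bufferVersion = 0              // stands in for the backing buffer's contents
var lastRenderTime = -Double.infinity

for frame in 0..<Int(refreshHz) {  // simulate one second of vsyncs
    let now = Double(frame) / refreshHz
    if now - lastRenderTime >= 1.0 / renderHz {
        bufferVersion += 1         // expensive: the app re-renders the scene
        lastRenderTime = now
    }
    // cheap: re-present the existing buffer, so the display (and head
    // tracking) stays at 90 Hz even though content updates at 4 Hz
    print("vsync \(frame): present buffer v\(bufferVersion)")
}
```

Head-tracked compositing still runs at full rate; only the content refresh is throttled, which is where the savings come from.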
 
In fact that is exactly what they're doing, and they talked about it in the keynote.

Beyond that, you're right that (at least without that) it's inadequate for 3D games at current state-of-the-art levels of scene and object complexity, but for a great many applications, you don't need 60fps or even 30fps. Double-buffer with rendering into a backing buffer at (say) 3-5fps, then render the buffer at 90fps, and you may have something that's totally adequate in terms of interacting with 3D tools, and frickin amazing in terms of utility compared to other options. If you really need fast frame rates at state-of-the-art levels of detail, then, yes, you may or may not be stuck depending on how good the foveated rendering is.

But the panels are sooo beautiful (i.e. capable) I'll be sad if I can't rock crazy rendered environments on them. I'm just going to pray for a tethering capability that I have no evidence exists.

And on the bright side (pun intended), the insane contrast ratios (something I've been begging for on other HMDs) will make a huge difference in the AR pass-through feeling truly real. So ultimately I expect the tiny 200x200-pixel rendered butterfly that lands on my hand in AR to look amazing (the M2, for all its limitations, should be able to handle that, no problem).
 
Is it just me, or has literally nobody mentioned anything about FOV?

I’m aware that it’s not a spec that Apple have officially revealed, but I’ve watched multiple videos of people who’ve actually demo’d the device and none of them have commented on it at all
 
Is it just me, or has literally nobody mentioned anything about FOV?

I’m aware that it’s not a spec that Apple have officially revealed, but I’ve watched multiple videos of people who’ve actually demo’d the device and none of them have commented on it at all
Yes, it's just you, because you didn't even bother to read back on this page. See for example post #213.

It is strange that so little has been said. But as I mentioned above, FoV will need to be looked at in a new way with this headset. How new is subject to research, but I don't know of any (and I haven't looked).
 
Let me guess, the Zeiss Optical Inserts will add at least $1000 to the price. And God forbid you have more than one person in the household or business who wears glasses. Add thousands more to the price.
 
Maybe you should have tried reading what I wrote — I clearly was asking your opinion about MicroOLED vs MicroLED — I never mentioned MiniLED.

And you say that “Not many people have experienced it since it hasn't really been used in consumer technology yet” — exactly right — so why are you spewing out anecdotal nonsense about an unreleased product using technology you haven’t experienced?
Why can't you follow simple logic? While it's not in consumer products yet, microLED does exist; you can see demonstrations of it; OLED has flaws that microLED doesn't; the headset was rumored to have microLED; it doesn't; therefore I have no interest. As I mentioned from the start, I wasn't going to buy one because of the price anyway, but OLED made sure of that. While Apple apparently doesn't have much choice at the moment, it's a compromise technology while waiting for something better. But you're clearly just here to argue, not to mention weirdly defensive of OLED, so I can't imagine any reply you might have will be worth anything.
 
...and there's no word on just how many cameras and sensors are inside.

I believe that during the keynote they mentioned 12 cameras, 5 sensors and 6 microphones.

But no other details yet.
 