
MacRumors
macrumors bot
Original poster


There have been multiple signs suggesting that Apple's mixed reality Vision Pro headset is struggling to take off, both because of its very high price and because of a heavy design that limits how long most people can wear it. What does the Vision Pro's floundering mean for Apple's work on future virtual and augmented reality projects?

Apple-Glasses-Triad-Feature.jpg

Vision Pro Interest

Interest in the Vision Pro was high in February when the device first launched because it was an all-new product category for Apple, but that didn't last. The Vision Pro is indisputably impressive, and it is mind-blowing to watch a butterfly flit by so close that it feels like you can reach out and touch it, or to see the rough skin of an elephant as it walks right by you, but the magic quickly wears off for most.

vision-pro-headset-1.jpg

Early reviews found the Vision Pro hard to wear for long enough to incorporate it into a real workflow, and reviewers struggled to find a use case that justified the $3,500 price tag. The Verge's Nilay Patel found the Vision Pro uncomfortably isolating, and The Wall Street Journal's Joanna Stern got nauseated every time she watched the limited amount of Apple Immersive Video content available. Reviewers agreed that watching TV and movies was one of the best use cases, but that makes for an expensive TV that can't be watched with anyone else.

Months later, sentiment hasn't changed much. There was a lot of demand at Apple Stores when the Vision Pro launched, with long lines of people waiting to give it a try. Once the demo was over, though, interest fell. As early as April, there were reports that enthusiasm for the Vision Pro had dropped significantly, and by July, there were reports of waning sales.

Apple-Vision-Pro-with-battery-Feature-Blue-Magenta.jpg

At MacRumors, a few of us bought a Vision Pro at launch, and those headsets are now tucked away in their cases and rarely pulled out, except to occasionally watch Apple's latest Immersive Video or to update to new visionOS software. MacRumors videographer Dan Barbera uses his Vision Pro once a week or so for watching content, but only for about two hours, because it becomes painful to keep using it after that point. MacRumors editor-in-chief Eric Slivka and I haven't found a compelling use case, and there's no content appealing enough for even weekly use.

I still can't wear the Vision Pro for more than two hours or so because it's too uncomfortable, and I'm prone to motion sickness so it sometimes makes me feel queasy if there's too much movement. The biggest reason I don't use the Vision Pro, though, is that I don't want to shut out what's around me. Sure, it's great for watching movies, TV shows, or YouTube videos on a screen that looks like it's 100 feet tall, but to do that, I have to isolate myself. I can't watch with other people, and I feel genuinely guilty when my cat comes over for attention and I'm distracted by my headset.

Watching movies on the Vision Pro is not a better experience than using the 65-inch TV in front of my couch. I am a gamer, but there aren't many interesting games, and a lot of the content that's available feels like playing a mobile game in a less intuitive way. Using it as a display for my Mac is the best use case I've found, but it's limited to a single display, and it's not enough of an improvement over my two-display setup to justify being uncomfortable while I work.

vision-pro-macbook.jpg

Beyond our own experiences with the Vision Pro, MacRumors site traffic indicates a lack of interest in the headset. When we publish a story about the Vision Pro, people don't read it. Just yesterday, for example, I wrote a story about the first short film made for the headset, and it was our lowest-traffic article of the day. It probably wasn't even worth my time to write, and that's not an isolated incident.

There are enterprise use cases for the Vision Pro, and some people out there do love the headset, so it does have some promise. Apple has been marketing it to businesses; some examples, from Apple and others:
  • Porsche - Porsche engineers use the Vision Pro to visualize car data in real time.
  • KLM - The airline is using the Vision Pro for training technicians on new engine models.
  • Law enforcement - Police departments in California are testing Vision Pro for surveillance work.
  • Medicine - A medical team in the UK used the Vision Pro for two spinal surgeries. Doctors in India also reportedly use it for laparoscopic surgeries, and an orthopedic surgeon in Brazil used it during a shoulder surgery. UCSD has been testing the use of Vision Pro apps for minimally invasive surgery.
  • Science - MIT students wrote an app to control a robot using the Vision Pro's gesture support (see the hand-tracking sketch below).
Tim Cook said in May that "half" of Fortune 100 companies bought a Vision Pro, but whether those headsets are actually in use, and for what, is unknown.
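
For the curious, here is a minimal, hypothetical sketch of the kind of hand-tracking input an app like the MIT robot controller can read on visionOS, using ARKit's ARKitSession and HandTrackingProvider. The pinch threshold and the robot-command comment are illustrative assumptions, not details from the MIT project, and in a real app this only runs inside an immersive space with hand-tracking permission granted.

Code:
import ARKit
import simd

// Hypothetical sketch: read hand-joint positions on visionOS and detect a pinch.
// The robot-command hook is a placeholder; the MIT app's actual transport is unknown.
func watchForPinches() async {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()

    do {
        try await session.run([handTracking])
    } catch {
        print("Hand tracking unavailable: \(error)")
        return
    }

    for await update in handTracking.anchorUpdates {
        let hand = update.anchor
        guard hand.isTracked, let skeleton = hand.handSkeleton else { continue }

        // Joint transforms are relative to the hand anchor; convert them to world space.
        let thumbTip = hand.originFromAnchorTransform *
            skeleton.joint(.thumbTip).anchorFromJointTransform
        let indexTip = hand.originFromAnchorTransform *
            skeleton.joint(.indexFingerTip).anchorFromJointTransform

        // Simple pinch heuristic: thumb and index fingertips within ~2 cm of each other.
        if simd_distance(thumbTip.columns.3, indexTip.columns.3) < 0.02 {
            print("\(hand.chirality) hand pinched") // e.g. send a command to the robot here
        }
    }
}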

Confusion Over What's Next

With Vision Pro sales coming in under what Apple expected, we've seen some confusing rumors about what Apple's next move will be. There were initially rumors that Apple was working on two new versions of the Vision Pro, one that's cheaper and one that's a direct successor to the current model.

In April, Bloomberg's Mark Gurman said Apple would not launch a new version of the Vision Pro prior to the end of 2026, with Apple struggling to find ways to bring down the cost of the headset.

In June, The Information said Apple had suspended work on a second-generation Vision Pro to focus on a cheaper model. Later that same month, Gurman said that Apple might make the next Vision Pro reliant on a tethered iPhone or Mac, which could drop costs, and he said a cheaper headset could come out as early as the end of 2025.

In late September, Apple analyst Ming-Chi Kuo said Apple would begin productio... Click here to read rest of article

Article Link: Will Apple Ever Make AR Smart Glasses?
 
I like that MacRumors is doing more long-form editorial, but “the magic quickly wears off for most” seems a bit unsubstantiated. Maybe you guys haven’t found many compelling use cases, but reading an AP Style Guide in Books might be a good one.
 
When the technology is ready, I do believe we will see such glasses from Apple, but that is several years out.
Meta announced their prototype a few weeks back, but that thing is also years away from a consumer product.
Only time will tell.
 
We saw similar doubters with the Apple Watch, which took a few generations to start taking off, and now you can't turn your head in a city without seeing Apple Watches on every wrist.

The Vision Pro was launched at the high end first, a strategy where early adopters fund the consumer product to come. While this may be a different approach than what we're used to from modern Apple, it's the strategy Steve Jobs took with the Macintosh. The original Mac was just too expensive for the average user, and over time it was refined until the iMac became the computer for everybody.
 
“….it is mind blowing to watch a butterfly flit by so closely it feels like you can reach out and touch it….”

Er, go to the park.


“….or to see the rough skin of an elephant as it walks right by you….”

One word, three letters, starts with ‘z’.
 
My company and our fleet of AVPs fall into the corporate use case. We’re essentially trying to skate to where the puck WILL be, but we long ago concluded the AVP, in this form at least, isn’t it. Our work is essentially evergreen, so we aren’t worried about a pivot, and it’s neat to see this nascent technology get off the ground. But this isn’t close to cruising altitude. The AVP is a rarity for Apple: a proof of concept struggling to find an identity and an answer to the problem it poses. It has some niche uses for which it is nice, but it’s never going to penetrate the mainstream. Not this one. I’d say the conclusion is closer to where all this is going: AR glasses with some dingus in your pocket that powers it all (the phone can’t do it for a while). If anyone wants to buy one, I’d caution them to strongly reconsider, even if that means the AVP as it stands now is a dead end for the company (there’s plenty of evidence suggesting it is). I’ll be interested to know where this COULD HAVE gone if Apple wasn’t so stubborn about the dumb car.
 
It is AR…
The compromise is so strong with the AVP. And things like the iPad apps and using a Mac are escape hatches to try to salvage use cases it’s not great at in either respect. I’d also say, based on dozens of folks trying out our dozen or so units, that the eye tracking MUST get better, or they need to rethink that and find a better interaction model.
 
The Vision headset is near-useless.

AR glasses are the only device to make. You don’t need a great display, and they won’t be for watching movies; they will have transparent displays that let you see content drawn over your vision, and that is all.

It will be tied to extremely sophisticated AI voice like ChatGPT Advanced Voice Mode. NOT SIRI.

There will be no computation in the glasses. They will have cameras and sensors, but all computation will be on the iPhone in your hand or pocket, 100%, full stop.

Think of them as giving eyes to this insanely powerful AI that is with you and talks to you in natural language, in natural ways, so it’s indistinguishable from talking to another person standing right there with you.
 
So some silly journalists call it "the Vision Pro's floundering" as if they have the metric for what AVP should be. They do not.

All along, the AVP has been more like a Newton-type engineering investment than a Newton spawn like the iPad. Y'all need to think long-term, farther ahead than just tomorrow's 1,000-word deadline.
 
AR glasses are the future, but not until they can look and feel pretty much no different than conventional glasses. Being able to walk around and have the glasses remind me who a person is, give directions, identify places and things I see, and show me answers to pretty much any question I may have will be great. It will take some years to reach that point, but when they do, the nature of any phone they pair with will change along with them.
 
All of this shows how amazing Google Glass was in its time.

Though the tech isn't there yet, the reason I like the idea of Meta's glasses (or even my Apple Watch) is that they help me become less addicted to the phone screen and leave it behind more often.
 
I don't really understand why they launched the AVP. I suppose they felt compelled to deliver something after years of work? I'm guessing. It's clear that a spatial OS needs some time to bake, so maybe it's just for that?

AR glasses are the next smartphones, and once they arrive they will be an even bigger revolution (which is saying a lot, since smartphones are a much bigger deal in the developing world than many give them credit for). But, as many are saying, they need to be actual comfortable glasses and provide heads-up info about the world. The challenges are many:

1. I still haven't seen a transparent display that can really provide vivid colors and high-def images on a wearable.
2. They'll need to make that super efficient so battery power can be contained in the frames (this is essentially sci-fi at this point, literally requiring a breakthrough). I could see this taking 30 years...until then, a tether.
3. They need all sorts of tiny sensors and cameras...it seems like we're close to achieving this, though I don't know about the power budget.
4. We need powerful AI to do instantaneous image analysis of the real world. We're getting close here, but not on any sort of economical power budget. Good thing AI is making amazing progress.
5. We need ridiculous amounts of storage for all that dual 4K video being shot (yes, we'll want it all saved indefinitely, then cataloged by AI (I'm sure remotely and after it's recorded) to be able to search and learn from it).

Then, the OS needs to be ironed out. Here, AVP is a great help. How will we instruct our AI glasses to provide more info about something? Will we be tapping in the air, fingers together, blinking, voice command, using our iPhones to gesture upon? Dunno.

I agree with some above that tethering seems the obvious first step, and I imagine Apple could deliver such a version in the next 5-10 years, but I think people will demand to keep their iPhones cordless. So maybe a first step is a battery-and-processing brick in a form factor like the AVP battery. Of course, they need to get the heads-up display glass working first, and I've yet to see it.

It's exciting that we have all the techno know-how at this point, but we're not at the stage where it can all be integrated well. It reminds me of year-2000 PDAs and cellphones. We could all envision a product that did it all, but it didn't seem feasible at the time...and we were right, it took 7 years. The capacitive LCD touchscreen and iOS were the big innovations.
 