I know it's first gen and everything, but I was hoping for something more than 'floating iPad windows' strapped to your face. With proper AR I want the room around me to become the interface, like every surface becomes an infinite whiteboard and monitor and I can pin things to physical spaces, e.g. my shopping list is just naturally floating on my fridge for me to add to. That would be proper spatial computing, because it uses actual space.

That will come. The world of AR is pretty limitless. Developers/users with fertile imaginations will take AVP far with interesting apps.

It's going to be fun revisiting some of the negative comments from people going back the last five years about Apple entering this field.

Kind of like the "who asked for 1,000 songs in your pocket" comments when SJ introduced iPod. And similar comments about Apple introducing iPhone/iPad/Watch/etc.
 
That is to say, I want a reason to get off my ass and gaze at the fridge so it can tell me what is going off and what needs replacing, rather than that just popping up as a notification in my field of view.
Maybe I'm missing something but ours definitely chimes/alerts when the filter needs replacing, the door didn't properly seal, or some other item needs our attention and we definitely need to walk over to it when that happens. Smarter, internal component failure alerts would be even better.
 
I like how the person they use in the videos looks incredibly smug, like someone who would smugly let you know that they're one of the first people to experience the “spatial-computing(tm) revolution” while looking at you with creepy, dead, fake eyes while wearing their stupid VR space goggles.
That video is a Rorschach test; there may be deeper emotional reasons why you feel this way.
 
I know it's first gen and everything, but I was hoping for something more than 'floating iPad windows' strapped to your face. With proper AR I want the room around me to become the interface, like every surface becomes an infinite whiteboard and monitor and I can pin things to physical spaces, e.g. my shopping list is just naturally floating on my fridge for me to add to. That would be proper spatial computing, because it uses actual space.
I’d be surprised if something like this doesn't happen.
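In fact, the building blocks arguably already shipped in visionOS 1.0: ARKit world anchors let an app pin content to a fixed spot in the room and find it again across sessions. Here's a minimal sketch of the "shopping list on the fridge" idea; the fridge transform and the plain white plane standing in for the note are made-up placeholders:

```swift
import ARKit
import RealityKit

// Minimal sketch: persistently pin a shopping-list entity at a fixed
// spot in the room (e.g. the front of the fridge) using visionOS ARKit.
// `fridgeTransform` is a placeholder; a real app would derive it from
// plane detection or from wherever the user drags the note.
@MainActor
final class FridgeNotePinner {
    private let session = ARKitSession()
    private let worldTracking = WorldTrackingProvider()

    func pinNote(at fridgeTransform: simd_float4x4,
                 in content: RealityViewContent) async throws {
        try await session.run([worldTracking])

        // A world anchor persists across app launches, so the note is
        // still "on the fridge" the next time the headset is worn.
        let anchor = WorldAnchor(originFromAnchorTransform: fridgeTransform)
        try await worldTracking.addAnchor(anchor)

        // Stand-in visual for the note; a real app would attach a
        // SwiftUI attachment or a small window instead.
        let note = ModelEntity(
            mesh: .generatePlane(width: 0.3, height: 0.4),
            materials: [SimpleMaterial(color: .white, isMetallic: false)]
        )
        note.transform = Transform(matrix: anchor.originFromAnchorTransform)
        content.add(note)
    }
}
```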
 
You're not gonna walk to your fridge every time to read your shopping list, though? That eliminates the entire advantage the digital world has over the physical world.

And even when you're in your kitchen you'd need to be wearing the Vision Pro. The only benefit over physical notes here is that you could have photos and interactive content, but that's hardly necessary for a shopping list.

I think his point is correct then. It's not 'spatial computing', it's just floating iPad screens in VR.

The iPad was obviously revolutionary to me because it launched with a keyboard and Pages; it was a great writing machine! From day one Apple marketed it as a line of computers rather than an accessory.

I reckon the endgame product is actually nothing like Apple's centrally controlled system at all, but rather an open-source system that pulls in and parses data from IoT sensors all around you. The Find My network, Amazon's Sidewalk: these things are all around us and don't need a centralised server to run from. An open platform to parse the data, the way WebKit can be used by different web browsers, is the answer.
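To make that concrete, here's a pure thought-experiment sketch in Swift of what a WebKit-style shared layer might look like. Every type and method here is hypothetical, not part of any real framework:

```swift
import Foundation
import simd

// Hypothetical sketch of an open "spatial data" layer: a shared engine
// that any vendor's AR front end could use, with IoT devices as
// plug-in sources. None of these types exist in a real framework.
protocol SpatialDataSource {
    var id: UUID { get }
    /// Where in the room this device or sensor lives.
    var position: SIMD3<Float> { get }
    /// Latest human-readable readings.
    func currentReadings() async -> [String: String]
}

struct FridgeSensor: SpatialDataSource {
    let id = UUID()
    let position: SIMD3<Float>
    func currentReadings() async -> [String: String] {
        ["filter": "replace in 3 days", "milk": "expires tomorrow"]
    }
}

// The shared engine: any headset UI could render what it returns.
struct SpatialEngine {
    var sources: [any SpatialDataSource]

    /// Only surface data for things the user is actually near or
    /// looking at, instead of spamming the whole field of view.
    func readings(near gaze: SIMD3<Float>,
                  radius: Float = 1.0) async -> [UUID: [String: String]] {
        var result: [UUID: [String: String]] = [:]
        for source in sources
        where simd_distance(source.position, gaze) < radius {
            result[source.id] = await source.currentReadings()
        }
        return result
    }
}
```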

How are you supposed to be productive with this thing? If you have to talk to it, it's not appropriate for an office. If you have to use a keyboard, then it's just a $3,500 monitor for one person with less than two hours of battery life that you still need a Mac for. If you're just using it to scroll through photos or make FaceTime calls, then it's less useful and less versatile than a Mac or an iPhone.

Your praise for the iPad is based on it being a productivity machine rather than an eReader. But so far neither anyone online nor any developer has announced an actual use for this thing that isn't better served by existing devices or traditional monitors. Apple will never get to an endgame product if the first (and subsequent) iterations fail because no one can figure out what to do with it in its current design.
 
I'm not saying this product is going to be revolutionary, but your comment feels like the comments when the iPad was introduced: "It's just a big iPhone".
As an interaction designer I'm looking more at the Vision Pro ushering in the next interaction paradigm: spatial computing. Right now it's a clunky Quest 3, a slightly less clunky Vision Pro, or an awkward pair of sunglasses. This might evolve into every device tapping into the spatial UI, like your oven knowing you're looking at it and floating a window 'in space', outside the confines of the oven, literally connecting the world around you.
I think you two are essentially saying the same thing, using different examples, and one with a tad more positivity than the other, but both hoping for grand new opportunities in spatial computing.
 
I'm really struggling to get excited about this.

What I do see is a lot of people being unrealistic with their expectations, saying/implying: "Yes, but just wait until these are the size of glasses or contacts!"
That I fear is very far off — if ever.

Mind you, at my age it's difficult to get excited about anything really… 🙂
 
I like how the person they use in the videos looks incredibly smug, like someone who would smugly let you know that they're one of the first people to experience the “spatial-computing(tm) revolution” while looking at you with creepy, dead, fake eyes while wearing their stupid VR space goggles.
I know what that smugness is. A good example could be iPhone cases that have open circles over the Apple logo. Heck, even the free stickers feel kind of smug. But I just don’t see it here. Apple’s been very careful not to market Vision Pro as exclusive to the wealthy. They did a good job burying the first Apple Watch Edition. Further, Vision Pro is going to be an at-home and non-optical product. Are people going to be jerks and wear them at the grocery store? Maybe. But that’s them and I just don’t think it’s fair to say that’s her.
 
That will come. The world of AR is pretty limitless. Developers/users with fertile imaginations will take AVP far with interesting apps.

It's going to be fun revisiting some of the negative comments from people going back the last five years about Apple entering this field.

Kind of like the "who asked for 1,000 songs in your pocket" comments when SJ introduced iPod. And similar comments about Apple introducing iPhone/iPad/Watch/etc.
Don't get me wrong, I think the Vision is an incredible piece of engineering, daft external eyes notwithstanding. It's just that we need to make sure AR doesn't end up in the same pile as 3D TV and NFTs just because the initial vision was wrong.
 
You're not gonna walk to your fridge every time to read your shopping list, though? That eliminates the entire advantage the digital world has over the physical world.

And even when you're in your kitchen you'd need to be wearing the Vision Pro. The only benefit over physical notes here is that you could have photos and interactive content, but that's hardly necessary for a shopping list.
There would be multiple ways to display and interact with the note, i.e. you could virtually pin a note in your Notes app to your fridge, but it would still exist in the Notes app.
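That split, one source of truth with many surfaces, could be as simple as the note record carrying an optional anchor reference. A hypothetical sketch (this is not how Notes actually stores anything):

```swift
import Foundation

// Hypothetical sketch: the note lives in one store; "pinned to the
// fridge" is just optional metadata pointing at a persisted anchor.
struct Note: Codable, Identifiable {
    let id: UUID
    var text: String
    /// ID of a persisted world anchor, if the note is pinned in space.
    /// Nil means it only appears in the app's normal list view.
    var worldAnchorID: UUID?
}

// Pinning in AR just records the anchor; the note itself never moves
// out of the app's database.
func pin(_ note: inout Note, to anchorID: UUID) {
    note.worldAnchorID = anchorID
}

var shoppingList = Note(id: UUID(), text: "Milk, eggs", worldAnchorID: nil)
// pin(&shoppingList, to: someAnchorID) when the user drops it on the fridge.
```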
 
How are you supposed to be productive with this thing? If you have to talk to it, it's not appropriate for an office. If you have to use a keyboard, then it's just a $3,500 monitor for one person with less than two hours of battery life that you still need a Mac for. If you're just using it to scroll through photos or make FaceTime calls, then it's less useful and less versatile than a Mac or an iPhone.

Your praise for the iPad is based on it being a productivity machine rather than an eReader. But so far neither anyone online nor any developer has announced an actual use for this thing that isn't better served by existing devices or traditional monitors. Apple will never get to an endgame product if the first (and subsequent) iterations fail because no one can figure out what to do with it in its current design.

Road warrior on long flight in coach (often ME). There's not enough room in front of you to open up a laptop enough to use it well on your lap or tray table. But there is enough room for the keyboard + trackpad 'half'. Slip on these and get work done on the long flight instead of NOT getting any work done because there's simply not enough room to use the computer.

Yes, one could pay up for first class to get the space to use a laptop as intended... but it won't take much first class premium to cover the cost of owning Vpro.

It won't be "just browse photos or just FaceTime"... this thing can show our eyes ANYTHING as if we are actually there. How much do people pay to stare at a flat 2D screen strapped to a Peloton bike to sort of feel like they are riding in all those destinations? That's basically a narrow "tunnel vision" approach, limiting your riding view to the width & height of that little screen. With this, the Peloton "pro" rider could look left, right, up, down and behind to see where they are riding, making the illusion of riding in such places seem far more real. If part of the fitness-developing "productivity" on a Peloton is the illusion of being in such places, this should enhance that illusion many times over the "as is" experience.

Have an amateur taste of this right now with one of the many VR 360˚ bike riding videos on YouTube. As one plays, click & hold on the video, then drag left, right, up & down etc. so you can simulate looking in various directions...


Unlike the existing Peloton experience where you can only basically look "straight ahead" unless the person filming the experience decides to pan around for you, this will let you turn your head to look at the other riders, the attractive person jogging down the beach, the sunrise over the water, etc. Just look around as you can do when riding an actual bike and turning your head to look "over there", "up there", etc. You'll see what is over there... out of the frame of the "as is" Peloton small "window" on that locale.

In that video example, the rider has the camera strapped to the bike, so you can't look behind (else you are looking at the rider)... but professional applications for things like Peloton will likely shoot with a camera at roughly rider eye level, so that the experience is very much like actually riding and thus, able to look anywhere, including behind you... more like THIS example...
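For what it's worth, that kind of look-anywhere playback is already buildable: the usual RealityKit trick is to map the video onto the inside of a large sphere surrounding the viewer. A rough sketch, with a placeholder file path:

```swift
import AVFoundation
import RealityKit

// Rough sketch: play a 360° (equirectangular) video on the inside of a
// sphere around the viewer. The file path is a placeholder.
func make360VideoEntity() -> ModelEntity {
    let url = URL(fileURLWithPath: "/path/to/ride360.mp4")
    let player = AVPlayer(url: url)

    // VideoMaterial renders the AVPlayer's output as a RealityKit material.
    let material = VideoMaterial(avPlayer: player)
    let sphere = ModelEntity(
        mesh: .generateSphere(radius: 100),
        materials: [material]
    )

    // A negative x-scale turns the sphere inside out, so the video is
    // visible from the center, where the viewer's head is.
    sphere.scale = SIMD3<Float>(-1, 1, 1)

    player.play()
    return sphere
}
```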


That very niche-y example can be extended to countless other applications where showing one's eyes anything makes the user that much more productive. Why do some people assemble 2, 4, 8 or more screens on a desk?

[Image: multi-screen trading desk setup]

Apparently, they are, or feel, more productive with all of that information available to them at a glance. This makes that kind of productivity available anywhere one happens to be... instead of only that one spot with what might be a couple hundred pounds of physical screens, supports, wires, racks, desk, etc.

And on and on.
 
I wonder what the accessibility alternatives will be for these hand gestures. I have cerebral palsy and think I would find them hit and miss. It’s an area where Apple normally excel, so I’m optimistic.
I wonder this too. I would think some other device might be necessary. But then portability might be majorly hampered unfortunately.
 
I'm really struggling to get excited about this.

What I do see is a lot of people saying/implying: "Yes, but just wait until these are the size of glasses or contacts!"
That I fear is very far off.

Mind you, at my age it's difficult to get excited about anything really… 🙂

This system can’t be reduced to the size of glasses. That requires a completely different approach. Something like this:

 
To set up your Persona, you'll remove Apple Vision Pro to capture your appearance. […]
For a device that's supposed to be more intuitive to use, I think they are overcomplicating things for normal users with that Persona concept. I get that they are doing this because the 3D rendering of your head/eyes won’t quite look like the real you, so it’s your “Persona” rather than just you, but I think that’s a distinction a regular user doesn’t need to be confronted with by a confusing new term during setup. They should just say something like “your face is going to be captured so it can be used in video calls and on the front display”.
 
That's just a very basic use case though. As and when this product gets refined down to glasses level, it needs to be less a device that pushes out data from its own apps and more something that pulls it in from IoT objects around it. We want AR to use the digital to enhance the physical space, not the other way around. That is to say, I want a reason to get off my ass and gaze at the fridge so it can tell me what is going off and what needs replacing, rather than that just popping up as a notification in my field of view. From a software design standpoint we don't want to overload the user's physical viewpoint; it should only react to things being physically gazed at.

Kind of like how the Focus system works in Horizon Zero Dawn.
I agree information context is often very important. That’s something that spatial computing in AR offers in a unique and potentially powerful way.
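Worth noting that visionOS already nods in this direction, in a privacy-preserving way: apps never receive raw gaze coordinates, but a SwiftUI view can light up when the user looks at it via hover effects. A tiny sketch; the fridge status text is made up:

```swift
import SwiftUI

// Tiny sketch: visionOS apps never see where the user is looking, but
// a hover effect makes a view respond visually under the user's gaze.
struct FridgeStatusBadge: View {
    var body: some View {
        // Made-up status; a real widget would query the appliance.
        Label("Milk expires tomorrow", systemImage: "refrigerator")
            .padding()
            .background(.regularMaterial, in: Capsule())
            .hoverEffect(.highlight) // highlights only while looked at
    }
}
```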
 
For a device that's supposed to be more intuitive to use, I think they are overcomplicating things for normal users with that Persona concept. I get that they are doing this because the 3D rendering of your head/eyes won’t quite look like the real you, so it’s your “Persona” rather than just you, but I think that’s a distinction a regular user doesn’t need to be confronted with by a confusing new term during setup. They should just say something like “your face is going to be captured so it can be used in video calls and on the front display”.
So 20 words instead of "Persona"?

They probably explain what the Persona is.
 
I know what that smugness is. A good example could be iPhone cases that have open circles over the Apple logo. Heck, even the free stickers feel kind of smug. But I just don’t see it here. Apple’s been very careful not to market Vision Pro as exclusive to the wealthy. They did a good job burying the first Apple Watch Edition. Further, Vision Pro is going to be an at-home and non-optical product. Are people going to be jerks and wear them at the grocery store? Maybe. But that’s them and I just don’t think it’s fair to say that’s her.
I don’t feel like a clear case that protects your phone, lets you use one of its features, and lets you see the color you picked out instead of hiding it qualifies someone as smug.

A product that costs more than what the majority of people make in a month or two (starting out) is definitely gonna bring out the smug ones. Check out the Vision Pro forums on here and see some of it yourself.

As for people walking around in public with it on, I guarantee you'll see that, day one even. Some to try and prove that it isn't this vehicle to isolate us all even further, others to let you know they spent close to, if not more than, $4,000 on their space goggles.
 
For a device that's supposed to be more intuitive to use, I think they are overcomplicating things for normal users with that Persona concept. I get that they are doing this because the 3D rendering of your head/eyes won’t quite look like the real you, so it’s your “Persona” rather than just you, but I think that’s a distinction a regular user doesn’t need to be confronted with by a confusing new term during setup. They should just say something like “your face is going to be captured so it can be used in video calls and on the front display”.
A ‘persona’ isn’t an Apple-invented thing or a ‘confusing new term’. It’s a word in the dictionary. If you’re confused by that, that’s on you; don’t tar everyone with the same brush!

 
A ‘persona’ isn’t an Apple-invented thing or a ‘confusing new term’. It’s a word in the dictionary. If you’re confused by that, that’s on you; don’t tar everyone with the same brush!
I’m not confused, but my non-techy parents and siblings certainly would be. Yes, it’s a word in the dictionary, but that doesn’t tell you what it means in the Vision Pro context.
 