> I would assume you simply wear your glasses “inside” the goggles.

I thought Zeiss was making lenses for people who wear glasses, ones that go in the headset in front of the screen.
> This is an interesting take... I wonder if you are right about what pushed this forward.

Apple has multiple projects going on at the same time. Some move at a slow pace and are likely only picked up based on trends in the marketplace, or on how much the executive team feels they could make a dent in a particular segment. If you look at the presentation, many of the demos were solitary experiences, with the exception of the guy in the workspace and the woman in the living room interacting with each other. Notice that the two features that still seem to be going through testing, and were not available during the demos, are EyeSight and field of view. Those were likely last-minute additions, because they didn't see this needing many social interaction features.
I don’t think you’ll really be able to use a virtual 80 foot screen, because the Vision Pro has a fixed focus distance of about 6 feet.
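To put some rough numbers on that: the headset controls *angular* size, not physical size, so a fixed focus distance doesn't by itself cap how big a screen can appear. A back-of-the-envelope sketch in Swift, where the 6-foot figure is just the estimate above (not an Apple spec) and the 100 ft / 60 ft viewing numbers are made up for illustration; `angularWidth` is a hypothetical helper:

```swift
import Foundation

// Angular width, in degrees, of a flat screen `width` across seen from `distance`
// (both in the same units).
func angularWidth(width: Double, distance: Double) -> Double {
    2 * atan(width / (2 * distance)) * 180.0 / .pi
}

// Illustrative numbers: a 100 ft wide theater screen viewed from 60 ft away...
let theaterAngle = angularWidth(width: 100, distance: 60)       // ≈ 79.6°

// ...fills the same angle as a virtual screen only ~10 ft wide rendered at the
// assumed ~6 ft fixed focus distance.
let virtualWidth = 2 * 6 * tan((theaterAngle / 2) * .pi / 180)  // ≈ 10 ft
```

So the "hundred feet wide" pitch is about how much of your view the screen fills; the fixed focus mainly matters for comfort when content is meant to appear very close.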
> Ya know, it's expensive as hell and no one really knows what to do with it just yet.

That is definitely true about the spread of the iPad and the uses for it. The question is: would that success have happened if the iPad were worn on the face?
That's what makes it so exciting. People are going to be completely AMAZED at what gets done with this thing 5 years from now, when it's no more than half the price it is now.
I can remember calling the iPad an "oversized smartphone that no one will do anything with" when it first came out. I said it would fail in the marketplace and no one would buy them.
Now they're literally used everywhere. The waitress at the restaurant takes my order with one, and the Cinnabon at the mall uses theirs as a cash register. People in all walks of life and business use them every day to be more productive and efficient. Children use them to learn. It's really something, for a product line that first started out as, wait for it... the Newton, became the iPhone, then the iPad, and then the whole world of Apple we have today.
So, I say knock this thing at the peril of looking shortsighted. I think it's a game changer over the next five years for all sorts of things we haven't even thought of yet.
And the price will come down with time. It's a new product and a totally fresh take on an existing concept.
I'm totally excited about what will be done with Apple Vision in the next 5-10 years.
Wondering if it would help to reduce the feeling of isolation for me if my friends sitting in the cafe with me and checking iMessages could mirror their screens to my iPhone …
Apple's upcoming Vision Pro headset will support screen mirroring via AirPlay or FaceTime, according to code found in the beta 4 release of visionOS 1.0.
Code in beta 4 includes the following strings:

- Select a device to mirror content to from your Apple Vision Pro
- Only one activity is available when mirroring or sharing your view through AirPlay or FaceTime.

This suggests that users will be able to mirror their Vision Pro display to an external monitor or TV, or share their view with others through AirPlay or FaceTime. Other headsets like the Meta Quest have similar features, which can help reduce the feeling of isolation between the headset user and the people around them.
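On the developer side, in-app AirPlay pickers already exist on Apple's other platforms. A minimal sketch, assuming visionOS carries over AVKit's AVRoutePickerView as it exists on iOS; the view controller and its name are illustrative and not taken from the beta code:

```swift
import UIKit
import AVKit

// Illustrative only: a plain view controller exposing an AirPlay route picker,
// assuming AVRoutePickerView behaves on visionOS as it does on iOS.
final class MirrorPickerViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let picker = AVRoutePickerView(frame: CGRect(x: 0, y: 0, width: 44, height: 44))
        picker.prioritizesVideoDevices = true   // prefer TVs and monitors in the list
        picker.center = view.center
        view.addSubview(picker)
    }
}
```

The system-level mirroring hinted at by the strings above would presumably not require any of this; the sketch only shows how an individual app can offer an AirPlay destination picker today.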
Reset EyeSight Data
Apple's visionOS 1.0 beta 4 also includes a new option to reset EyeSight data. EyeSight is the feature that displays a user's simulated eyes on the external display of the Apple Vision Pro. The option reads:
- You can reset EyeSight by going to Settings > People Awareness and tapping Reset Personalized EyeSight. This will remove personalized eye details from EyeSight, like your eye shape and measurements, but EyeSight will still use your skin tone where available. After you have reset EyeSight, you can restore it by recapturing your Persona.
Sharing of Persona Data
The code indicates that a user's Persona will be sent to all participants in a FaceTime call so that they can view it. Apple Vision Pro can generate Personas via machine learning, allowing users to share virtual representations of themselves that reflect face and hand movements in real time with others over FaceTime.
The code also indicates that Personas, but not the data used to generate them, could be stored on Apple's servers, albeit in a manner that isn't accessible to Apple.
- For FaceTime calls on a visionOS device, your Persona will be sent securely to all of the people on the call so they can view your Persona. After a call is completed, your Persona may remain stored encrypted on the other call participants’ devices for up to 30 days. The other call participants will be able to access your Persona only when they are on a call with you.
- To create your Persona and personalized EyeSight, Apple Vision Pro cameras capture images and 3D measurements of your face, head, upper body, and facial expressions. The data used to build your Persona and EyeSight do not leave your device. Your Persona may be stored on Apple servers, encrypted in a way that Apple cannot access.
Additional New Alerts
visionOS 1.0 beta 4 also includes the following new alerts:
- Calling unavailable while in Travel Mode
- Brighten your lighting to use your Persona.
- This video has excess motion, and could cause discomfort if expanded.
Apple has stated that Apple Vision Pro will launch in the U.S. in early 2024 before expanding to other countries at later dates.
Article Link: Apple Vision Pro to Support Screen Mirroring via AirPlay and Other Tidbits
> Wondering if it would help to reduce the feeling of isolation for me if my friends sitting in the cafe with me and checking iMessages could mirror their screens to my iPhone …

You can see them and your environment, so no need to spy on them.
Except the iPad was very well priced… way lower than people expected.
People expected $1,000; it was released at $499.
> A few seconds after this point in the video they show a huge screen movie playing and they say “make your screen feel a hundred feet wide”.

Yeah, not sure they thought that through.
> Ya know, it's expensive as hell and no one really knows what to do with it just yet.

I just spent $60k (last week) on a new set of speakers, and my wife struggles to understand what I am going to do with it.
> We get it, you can neither afford it nor imagine a use case for it.

It's amazing how people react to financial things when they search for an excuse either to buy or to not buy something.
> I wonder how much the corrective lenses will be for people who wear glasses.

Less than a set of ordinary symmetrical NF XLR cables for my new record player 🤣🤣
> So that’s it? $3,500 for a giant screen?

Well, that's not all of it. It can also be a multi-monitor setup, and a way to enjoy spatial videos that you take with your iPhone.
> This kind of device will need apps that run on it, and it will take years to build an ecosystem of applications. At the beginning it is a technology demonstrator for people who can afford it and for developers who need to explore the opportunities with it.

I think this shares an advantage that the iPad had when it was first introduced: it will be able to run thousands of iOS apps right out of the box. This means you won't have to switch devices to answer a call, pay a bill, post on social media, trade stocks, shop, or read the news.
> Hey, remember when mainframe guys mocked the mouse? They called it a frivolous toy and slower than command line interfaces. Remember that?

You don't have to strap a mouse to your face though, big difference.
It’s wild how those enjoying one use case can’t get their heads around other use cases. Same as it ever was!
> Why would you strap a weight on your head to see the stuff you could see on your screen, or AirPlay it to a bigger screen?

For example, because I can have a huge screen (or even multiple screens) anywhere in my room/flat/house/garden that won't be an ugly black slab occupying space and collecting dust when not in use?!
> It's wild how those enjoying one use case can't get their heads around other use cases. Same as it ever was!

Are you talking about Apple, who couldn't think of any use case except using it as a screen and TV replacement?
> For example, because I can have a huge screen (or even multiple screens) anywhere in my room/flat/house/garden that won't be an ugly black slab occupying space and collecting dust when not in use?!

Instead it will be an ugly AR/VR headset and battery brick occupying space and collecting dust about 30 days after purchase.
That virtual screen (or screens) can also be way bigger than what would be feasible to handle as physical goods in a given location.
And I won’t have to fight with reflections or displays with poor ergonomics *cough*iMacAndAppleMonitors*cough*
> $3,500 really isn't a lot of money. But it is a lot of money for this device.

Ohhhhh, excuse us peasants here, my Lord..... where I live, that's a car.
> Less than a set of ordinary symmetrical NF XLR cables for my new record player 🤣🤣

Found the audiophile! Well, I guess that's the kind of audience for the AVP.
> You don't have to strap a mouse to your face though, big difference.

Also, mice nowadays have much higher resolution than the AVP. 😀