Regarding the virtual workspace use case, what I said in the comment you quoted still stands. They are still using pixels. They may be smaller pixels than the competition's for all I know, but the math on how much of the virtual environment is represented by a pixel (even a small one) 3 inches from your eye still stands. Any illusion of having several high-resolution (>=1440p) virtual monitors is out the window, as I stated in a number of comments.

I think this is the reason Apple is showing floating windows for individual apps (or app tabs, modes, etc.), not a full-screen workspace as you would have on a monitor. On a traditional monitor (4K, for example) you would have multiple such app windows in the workspace 18-24 inches from your face, and everything would be super sharp. Rendering such a workspace so it appears 18-24 inches in front of you is not likely to be as sharp, even with Apple’s higher-res displays.

I know the pitch is now “the entire room is your workspace, not just a monitor,” but if you end up with effectively the same amount of functional real estate for applications, because they are rendered rather large in the environment or you have to choose one “foreground” app to render sharply, I just don’t see this as a huge leap forward over just having a couple of physical monitors.
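As a rough sanity check on that claim, here's a back-of-the-envelope estimate of how many headset pixels a virtual monitor would actually get. The per-eye resolution and field of view below are assumed, illustrative figures (publicly reported numbers vary), not official specs:

```python
import math

# Assumed headset figures -- illustrative, not official specs.
horiz_pixels_per_eye = 3660          # assumed per-eye horizontal resolution
horiz_fov_deg = 100                  # assumed horizontal field of view
ppd = horiz_pixels_per_eye / horiz_fov_deg   # ~37 pixels per degree

# A 27" 16:9 monitor is ~59.8 cm wide; place it virtually 50 cm away.
width_cm, distance_cm = 59.8, 50
angle_deg = math.degrees(2 * math.atan((width_cm / 2) / distance_cm))  # ~62 deg

# Headset pixels spanning that virtual monitor's width.
virtual_width_px = ppd * angle_deg   # ~2260 px -- well short of a 4K panel's 3840
print(f"{ppd:.1f} px/deg, {angle_deg:.1f} deg, {virtual_width_px:.0f} px")
```

Under these assumptions, a virtual 27" monitor at arm's length gets only around 2,260 headset pixels across, below even a 2560-wide 1440p panel, which is consistent with the skepticism above.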

What you need to do is match the angular resolution of the eye. The eye's angular resolution is about one minute of arc, or about 0.0003 radians. This works out to about 0.0003 meters at 1 meter distance, or let's say a third of a millimeter at one meter. A screen at one meter distance would need about 6 pixels per millimeter to show two points separated by 0.0003 radians. I'd argue that no desktop monitor is this good. We can clearly get away with a quarter of that "perfect" resolution. My 27" 4K screen has only half of "perfect" resolution and I don't see a need to upgrade to 8K.
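A quick script verifies the arcminute arithmetic above (pure geometry, no device assumptions):

```python
import math

# One arcminute in radians: (1/60) of a degree.
arcmin = math.radians(1 / 60)          # ~0.00029 rad

# Size that angle subtends at 1 metre, in millimetres.
spot_mm = arcmin * 1000                # ~0.29 mm

# To show two points separated by that angle, sampling theory
# suggests roughly two pixels per spot, i.e. a pixel pitch of
# about half the spot size.
pixels_per_mm = 2 / spot_mm            # ~6.9 px/mm

print(f"1 arcmin = {arcmin:.6f} rad")
print(f"spot at 1 m = {spot_mm:.2f} mm")
print(f"needed density = {pixels_per_mm:.1f} px/mm")
```

So "about 6 pixels per millimeter at one meter" checks out as the density needed to match one-arcminute acuity.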

So I don't think we need to match perfect 20/20 vision.

The best use of these goggles is going to be "telepresence" where people who are separated by distance can all meet virtually in the same space and maybe talk about objects like the design of a proposed building. The architect presents his 3D model of the building and everyone can see it and even walk around inside the building and can see the other people there inside the building and they can talk about what they all can see.

I do construction on a small scale. Today I have to work hard to tell a client she needs to increase her budget by $15K for better materials, spend another $6K moving a wall, or maybe another $5K moving plumbing pipes. Today I have to make a sketch on paper with colored pencils, or a computer rendering, and use a LOT of words to describe how it will look. It would be so much better to let the client put on the goggles so we could both walk around in the new space.

But I doubt my small projects could support the cost of creating a virtual space. But maybe in time, some kind of AI assistant could do that based on my sketches and verbal input.
 
I disagree. The VR/AR experience that Apple showed can be used for business, can help expand workflows and education, and can be used for jobs; you can get a lot more productivity done through the use of expanded screens and interactive experiences for viewing the internet, objects, etc.

Car? We upgrade the car to what? More buttons on a dashboard that no one uses, because we all still listen to radios? Car steering wheels have become cluttered with buttons, touch-screen dashboards have become cluttered, and you're still limited by laws; more and more laws come out about dashboard use and driver rules. The car has peaked; it's not a revolutionary thing anymore. Other than making the car more comfortable for drivers, it's impossible to change much when you have limitations by law.
We use two monitors and that works perfectly fine. And they're $250 a piece. We're using a POS and a web browser, not doing surgery.
 
The best use of these goggles is going to be "telepresence" where people who are separated by distance can all meet virtually in the same space and maybe talk about objects like the design of a proposed building.
While that is one use, I think it's likely to find use in areas where you can run scenarios on an individual or team basis. For example, a pro football team could have the QB "play" against various defenses without having to have other players involved. Scenarios could include busted plays, defensive shifts, etc. A coach could "run" the defense and offense much like a video game and react in real time.

On an industrial scale, a robot could access places inaccessible or dangerous for humans while providing the "you are there" feeling.

Telemedicine, with some sensory inputs is another.

$3500 for the advancement over existing tech is nothing.
 
You are correct that for actual workspaces, we are happy to deal with monitors with resolution that is nowhere close to the resolving power of our eyes. My response was more about the use case brought up often prior to the reveal where many thought they could effectively have multiple hi-res “virtual monitors” with the headset. This, IMHO, was not going to happen without a paradigm change in image generation, and I saw nothing in the reveal to make me think otherwise. It all looks very “cool” but I don’t see it enhancing productivity much at all for most workers who do things on a monitor or two today.

I’ve always acknowledged that this device has potential for specific professional (read: low volume, high cost) use cases like some of the examples you’ve cited. But I still don’t see a “killer app” that would make it worthwhile for the masses. Perhaps that will come since it is clearly aimed at developers at the moment.
 
I still have to drive to work, dude. I would love to have an Apple car. City folk always dump on cars with their fancy trains, but I am in the middle of nowhere, PA. My commute is an hour each way. Listening to music on better speakers with cooler buttons and stuff would make my commute more enjoyable. The only train nearby is carrying freight. I work in retail; I used to be in Voice Engineering. When I go home, I play my PlayStation or my PC. I watch YouTube videos while gaming.

I don't need or want one of these things, and I am the target demographic. A lot of people wouldn't need one even if they could afford $3,499, which is the price of a car up here. It's an over-engineered product desperately forcing a market that Oculus and Sony are both struggling to grow past early adopters, geeks, and CHILDREN.
 
If the cost of this technology decreases and it becomes as commonly used as smartphones are now, it's possible that future apartments won't have designated areas for mounting or hanging TVs. Instead, there may be blank white walls where individuals can anchor the virtual screens from their augmented reality headsets to complete tasks. That would be insane.
 

Yeah I'm really curious how that works out too. Excited to hopefully be able to test it next year! (or maybe via a dev-kit)

I've been observing a bit more consciously how I work on my wide-screen Eizo monitor (3840x1600, so about 6 megapixels). Effectively I'm never looking at the complete screen. My focus is just on a window, or part of a window, and I move my eyes or head to shift focus between apps/windows/components. So if my monitor extended infinitely in all directions, I would basically have the same effect, except for the background and the spatial effects. The part of the screen I'm actually looking at is at best maybe 2-3K pixels across, never the full width. So 4K's worth of much smaller pixels might very well compensate for that. From what I've been reading from reviewers, text remains clear and crisp under all conditions, so it at least sounds like Apple might have nailed that. For now I have my hopes up that this thing really delivers what is needed to act as a multi-monitor replacement. But we'll see; I definitely look forward to experiencing it myself!
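A rough pixels-per-degree comparison puts some numbers on this. The monitor size, viewing distance, and headset figures below are all assumptions for illustration, not measured specs:

```python
import math

def pixels_per_degree(h_pixels, width_cm, distance_cm):
    """Average horizontal pixels per degree of visual angle."""
    fov_deg = math.degrees(2 * math.atan((width_cm / 2) / distance_cm))
    return h_pixels / fov_deg

# Assumed: a 37.5" 3840x1600 ultrawide (~88 cm wide) viewed at 60 cm.
monitor_ppd = pixels_per_degree(3840, 88, 60)   # ~53 px/deg

# Assumed headset: ~3660 px per eye across ~100 deg horizontal FOV.
headset_ppd = 3660 / 100                        # ~37 px/deg

print(f"monitor {monitor_ppd:.0f} px/deg vs headset {headset_ppd:.0f} px/deg")
```

Under these assumptions the headset's angular pixel density is noticeably below the desk monitor's, but in the same ballpark rather than an order of magnitude off, which fits the "might very well compensate" hope above.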
 
Government offices, university campuses, hospital areas? No.
Yes, you will see these in hospitals. VR is ideal for radiology. I can imagine a physician using hand motions to slice open an organ and turn it around to get a better view. It is a good way to look at CAT scan data.

Another hospital use case is robotic surgery. It is inherently 3D but today they use a real-time flat-screen monitor.

As for universities, maybe not in the admin building but certainly you will find these in some labs. Any lab where 3D structure is important. Maybe some biochemistry or robotics research lab. In government, only in the same kind of labs as in universities.

These will be used by architects to show clients what the space will look like and to allow a client to do a "walk-through" of a design before it gets finalized. Architects themselves don't need this; they can "see" the final space by looking at the 2D drawings, but that is a professional skill most clients lack.

Finally, this is only the beginning. In ten years the goggles and the software will be much better.

The goggles might also help with certain kinds of training, perhaps training tank commanders. The current tank simulators use a ring of flat monitors around the upper hatch; a $3,500 VR goggle would actually save money vs. the current system. Perhaps the same goes for pilot training.

But sadly, the most common use case might be rich gamers who like to shoot virtual machine guns at virtual monsters.
 