Yeah, they should have taken the Apple way of releasing the same phone every year - again.
They didn’t even have to take the Apple way; there are 250 smartphone companies in the world, and they could have taken ANY of their ways! Especially since their phone was made by HTC: just learn from them and improve.
 
It is THE Vision Pro killer
…It’s neither a headset nor an actual product.

Spatial computing (XR) glasses are a completely different category of spatial computing that will coexist with headsets.

Just like desktops vs. laptops vs. mobile: the most demanding, highest-end spatial computing use cases will always be most feasible on a headset.

The most convenient, portable, in-the-moment spatial computing use cases will be best done on XR glasses.

Many will justifiably own both.
 
And no history of shipping cutting-edge, state-of-the-art tech hardware at all.

I swear, some people's desire to attack Apple over the Vision Pro leads them to lose all common sense. We get that you don't like the product (or maybe you're upset you can't afford one), but that doesn't mean Meta has suddenly figured out how to ship a $10k futuristic prototype device for $300 later this year.

Don't get me wrong; the prototype looks very impressive, and the couple of people I trust who used one say it's great, but everything I read says the screen tech is nowhere close to being producible at scale. So we're likely talking years before there's anything you'll be able to buy. And I will repeat: if you don't think Apple has something like this in its labs, you're deluding yourself. But Apple is never going to show a prototype. Hell, they had police raid Gizmodo's editor's apartment for sharing a phone that was coming out in a month and a half.
Do you consider the Quest 3 cutting edge for $500? Or is it only possible to be cutting edge at $3500?
 
…It’s neither a headset nor an actual product.

Spatial computing (XR) glasses are a completely different category of spatial computing that will coexist with headsets.

Just like desktops vs. laptops vs. mobile: the most demanding, highest-end spatial computing use cases will always be most feasible on a headset.

The most convenient, portable, in-the-moment spatial computing use cases will be best done on XR glasses.

Many will justifiably own both.
"Spatial computing," aka Tim Cook's marketing BS? It never existed! The AVP is nothing but VR goggles supercharged with a ton of money. You can take any existing VR goggle, add the super-high-resolution displays, an expensive camera array, AND the annoying, cumbersome setup process, and there you go. And yes, it lives within the Apple ecosystem.
There isn't even new technology built in. Basically it is an iPhone strapped to your head.

And yes, it will fail the same way all the other VR solutions failed before (to be honest, it has failed already).

The Orion glasses are what people want, and they are a product like the AVP, with the only difference that Apple decided to throw its product on the market, while Meta decided that Orion is still too expensive and can be improved further (the same thing Apple employees were saying about the AVP, but Tim wanted to present at least one new product during his career).

We will see products like the Orion glasses, maybe in three, five, or ten years. But I'm convinced that Meta would sell more Orion glasses for $10k than Apple sells AVPs for $3.5k right now.
 
"Spatial computing," aka Tim Cook's marketing BS? It never existed! The AVP is nothing but VR goggles supercharged with a ton of money. You can take any existing VR goggle, add the super-high-resolution displays, an expensive camera array, AND the annoying, cumbersome setup process, and there you go. And yes, it lives within the Apple ecosystem.
There isn't even new technology built in. Basically it is an iPhone strapped to your head.

And yes, it will fail the same way all the other VR solutions failed before (to be honest, it has failed already).

The Orion glasses are what people want, and they are a product like the AVP, with the only difference that Apple decided to throw its product on the market, while Meta decided that Orion is still too expensive and can be improved further (the same thing Apple employees were saying about the AVP, but Tim wanted to present at least one new product during his career).

We will see products like the Orion glasses, maybe in three, five, or ten years. But I'm convinced that Meta would sell more Orion glasses for $10k than Apple sells AVPs for $3.5k right now.
…Now you’re being woefully tech ignorant.

“Spatial computing” is NOT a term Apple created; it is an established term in human-computer interaction (HCI), computer science, and academia.

It’s the formal term that covers both dedicated spatial computing or extended reality (XR) devices and spatial-computing features added to traditional computing devices, such as spatial video recording on iPhones.

Spatial computing headsets and glasses have distinct trade-offs, so they will always coexist with one another, like desktops, laptops, and phones. Some will justifiably have both.

The most advanced and demanding spatial computing use cases will always be more appropriate and best done on a headset; in-the-moment, convenient, on-the-go spatial computing use cases will be best done on glasses/contacts.

XR glasses are more approachable and will be what most average people gravitate towards; people with certain hairstyles fundamentally won’t use a headset, after all.

However, the closer glasses come to headsets in horsepower, the more expensive they will be compared to the equivalent headset.

That’s exactly how laptops and handhelds work compared to desktops and consoles.

Glasses as powerful, sharp, or capable in spatial computing as a Vision Pro today would carry a higher price than the Vision Pro.

Because of that approachability, XR glasses will be pursued more at prosumer and mainstream tiers, while headsets will be enthusiast-oriented when it comes to a good XR experience.

Mediocre headsets like Meta’s, with severe compromises, will exist for reasons unrelated to spreading a good spatial computing experience: Meta takes a long-term approach of minimizing costs and deterring smaller companies that don’t have successes in entirely different industries to bankroll a loss-leader SKU (IMO, governments should convene to deter such behavior).

If Meta approaches XR glasses exactly like that, many who like the device category will likely gravitate towards higher-end prosumer models from manufacturers who actually provide that, with rhetoric similar to the Vision Pro vs. the mediocre Quest headsets.

Quest headsets don’t even have HDR to be on par with non-XR hardware, nor the horsepower to play the AAA games gamers want to play on consoles that cost less or the same.
 
"Spatial computing," aka Tim Cook's marketing BS? It never existed!
As applicable to 3D space
In the early 1990s, as the field of virtual reality was beginning to be commercialized beyond academic and military labs, a startup called Worldesign in Seattle used the term Spatial Computing to describe the interaction between individual people and 3D spaces, operating more at the human end of the scale than previous GIS examples may have contemplated. The company built a CAVE-like environment it called the Virtual Environment Theater, whose 3D experience was of a virtual flyover of the Giza Plateau, circa 3000 BC. Robert Jacobson, CEO of Worldesign, attributes the origins of the term to experiments at the Human Interface Technology Lab, at the University of Washington, under the direction of Thomas A. Furness III. Jacobson was a co-founder of that lab before spinning off this early VR startup.

In 1997, an academic publication by T. Caelli, Peng Lam, and H. Bunke called "Spatial Computing: Issues in Vision, Multimedia and Visualization Technologies" introduced the term more broadly for academic audiences.
========

Yes, it's not an Apple-created term, and it does have its own history in the scientific world.

Apple just borrowed the term to describe apps with a third axis that you can see wearing AR headsets. "Spatial" is also associated with immersive audio codecs, for example Dolby Atmos, which has been around since 2012. Sony labeling things like 3D spatial editing makes more sense than Apple calling AR "spatial computing" to avoid comparisons to other AR technology predecessors and competitors.
 
If these are the "nice things" you're talking about, then I'm totally fine with that; you can keep them.

These are a step in the right direction for the face-computer form factor, but if you can't see how chunky they are, then you need real glasses more than you need AR ones.
Guy who does nothing, complaining about people who do something. If you cannot comprehend the incremental progression of technology, I don't understand what you're doing on a tech forum.
 
Guy who does nothing, complaining about people who do something. If you cannot comprehend the incremental progression of technology, I don't understand what you're doing on a tech forum.
I like tech. I just don't put a lot of stock in tech demos panning out the way the companies say they will.

See all the missed promises of self-driving cars. See the fancy but limited Apple Vision Pro. See AirPower. See the countless AR glasses companies that have come and gone without ever selling a usable product. I see plenty of progress in a direction I think is probably the correct one, but I don't see a viable product, or a reason to believe this will pan out any differently from the dozens of other times I've heard this promise.

Talk to me when Facebook puts a pair on sale and I can evaluate them for myself.
 
…Now you’re being woefully tech ignorant.

“Spatial computing” is NOT a term Apple created; it is an established term in human-computer interaction (HCI), computer science, and academia.

It’s the formal term that covers both dedicated spatial computing or extended reality (XR) devices and spatial-computing features added to traditional computing devices, such as spatial video recording on iPhones.

Spatial computing headsets and glasses have distinct trade-offs, so they will always coexist with one another, like desktops, laptops, and phones. Some will justifiably have both.

The most advanced and demanding spatial computing use cases will always be more appropriate and best done on a headset; in-the-moment, convenient, on-the-go spatial computing use cases will be best done on glasses/contacts.

XR glasses are more approachable and will be what most average people gravitate towards; people with certain hairstyles fundamentally won’t use a headset, after all.

However, the closer glasses come to headsets in horsepower, the more expensive they will be compared to the equivalent headset.

That’s exactly how laptops and handhelds work compared to desktops and consoles.

Glasses as powerful, sharp, or capable in spatial computing as a Vision Pro today would carry a higher price than the Vision Pro.

Because of that approachability, XR glasses will be pursued more at prosumer and mainstream tiers, while headsets will be enthusiast-oriented when it comes to a good XR experience.

Mediocre headsets like Meta’s, with severe compromises, will exist for reasons unrelated to spreading a good spatial computing experience: Meta takes a long-term approach of minimizing costs and deterring smaller companies that don’t have successes in entirely different industries to bankroll a loss-leader SKU (IMO, governments should convene to deter such behavior).

If Meta approaches XR glasses exactly like that, many who like the device category will likely gravitate towards higher-end prosumer models from manufacturers who actually provide that, with rhetoric similar to the Vision Pro vs. the mediocre Quest headsets.

Quest headsets don’t even have HDR to be on par with non-XR hardware, nor the horsepower to play the AAA games gamers want to play on consoles that cost less or the same.

As applicable to 3D space
In the early 1990s, as the field of virtual reality was beginning to be commercialized beyond academic and military labs, a startup called Worldesign in Seattle used the term Spatial Computing to describe the interaction between individual people and 3D spaces, operating more at the human end of the scale than previous GIS examples may have contemplated. The company built a CAVE-like environment it called the Virtual Environment Theater, whose 3D experience was of a virtual flyover of the Giza Plateau, circa 3000 BC. Robert Jacobson, CEO of Worldesign, attributes the origins of the term to experiments at the Human Interface Technology Lab, at the University of Washington, under the direction of Thomas A. Furness III. Jacobson was a co-founder of that lab before spinning off this early VR startup.

In 1997, an academic publication by T. Caelli, Peng Lam, and H. Bunke called "Spatial Computing: Issues in Vision, Multimedia and Visualization Technologies" introduced the term more broadly for academic audiences.
========

Yes, it's not an Apple-created term, and it does have its own history in the scientific world.

Apple just borrowed the term to describe apps with a third axis that you can see wearing AR headsets. "Spatial" is also associated with immersive audio codecs, for example Dolby Atmos, which has been around since 2012. Sony labeling things like 3D spatial editing makes more sense than Apple calling AR "spatial computing" to avoid comparisons to other AR technology predecessors and competitors.

Tim Cook consistently emphasizes "spatial computing" to describe the mixed reality headset. While the term might technically fit, it feels a bit overplayed, and here’s why.

Apple has a history of coining new terms to make their products stand out, and "spatial computing" is no different. Other companies typically refer to similar tech as AR or VR, but Apple tries to reframe it with this new label to create a sense of novelty. It feels like a deliberate branding effort to make the Vision Pro seem more revolutionary than it is, especially given that similar devices are already on the market.


The term "spatial computing" is quite broad, covering a range of technologies that interact with physical space, including VR, AR, and gesture control. While Vision Pro undoubtedly pushes these boundaries, it doesn’t fully live up to the scope suggested by the term. Apple may be aiming for a future where digital and physical worlds merge seamlessly, but at launch, Vision Pro feels more like a high-end AR/VR device rather than the all-encompassing future of "spatial computing."


"Spatial computing" implies a fluid, seamless experience of interacting with digital elements in your physical environment. In reality, wearing a bulky headset, even as advanced as Vision Pro, still feels like an isolated, tech-heavy experience. Until the tech evolves further, this gap between the promise of "spatial computing" and the actual user experience makes the term oversold.


In short, when Tim constantly uses "spatial computing," it is marketing BS. The AVP is not so different from existing headsets, and Tim tries to tell you a story about a new computing era, which it is not.
To do so, he uses "spatial computing" and avoids terms like AR/VR.

So it is not the term that bothers me, but Tim's approach of reframing existing technology as novelty.
 
No other headsets do what the Vision Pro does. It'd be silly to frame the AVP as a VR headset.
But it is. There is almost no difference. Both systems give you the feeling of looking through cameras, and there is no "spatial computing"; it is very artificial.
And then subtract $3,000 from the AVP's price for the displays, cameras, and sensors.


 
…It’s neither a headset nor an actual product.

Spatial computing (XR) glasses are a completely different category of spatial computing that will coexist with headsets.

Just like desktops vs. laptops vs. mobile: the most demanding, highest-end spatial computing use cases will always be most feasible on a headset.

The most convenient, portable, in-the-moment spatial computing use cases will be best done on XR glasses.

Many will justifiably own both.
Laptops indeed killed most of the desktop market, right?
 
Nope… as long as they require a Facebook account to be attached, it's a big nope for me. Zuck can get a wild hair up his backside and shut down the attached Facebook account, turning an expensive pair of glasses into a paperweight.
 
But it doesn't work as a watch. It's required every time I put on the glasses. So then I'd have to put on a watch in addition to this wristband.

These are also internal prototypes that they decided to make a bunch of to show off. Not like this is anywhere near a final or finished product.

Adding watch features to the band is trivial compared to making the rest of these a viable product.
 