…Now you’re being woefully tech ignorant.
“Spatial computing” is NOT a term Apple created; it is an established term in human-computer interaction (HCI), computer science, and academia.
It’s the formal umbrella term covering both dedicated spatial computing or extended reality (XR) devices and spatial features added to traditional computing devices, like spatial video recording on iPhones.
Spatial computing headsets and glasses have distinct trade-offs, so they will always co-exist with one another, just like desktops, laptops, and phones do. Some people will justifiably own both.
Advanced and the most demanding spatial computing use cases will be more appropriate and best done on a headset; in-the-moment, convenient, on-the-go use cases will be best done on glasses/contacts.
XR glasses are more approachable and will be what most average people gravitate towards; people with certain hairstyles fundamentally won’t use a headset, after all.
However, the closer glasses get to headsets in horsepower, the more expensive they will be relative to the equivalent headset.
That’s exactly how laptops and handhelds work compared to desktops and consoles.
Glasses as powerful, sharp, or capable in spatial computing as today’s Vision Pro would carry a higher price than the Vision Pro.
Because of that approachability, XR glasses will be pursued more at prosumer and mainstream tiers, while headsets will be enthusiast-oriented when it comes to a good XR experience.
Mediocre headsets like Meta’s, with severe compromises, will exist for reasons unrelated to proliferating a good spatial computing experience: Meta is taking a long-term approach of minimizing costs and deterring smaller companies that don’t have successes in entirely different industries to bankroll a loss-leader SKU (IMO, governments should convene to deter such behavior).
If Meta approaches XR glasses exactly like that, many who like the device category will likely gravitate towards higher-end prosumer models from manufacturers who actually deliver, with similar rhetoric aimed at Meta, much like the Vision Pro versus the mediocre Quest headsets.
Quest headsets that don’t even have HDR to be on par with non-XR hardware, nor the horsepower to play the AAA games gamers want to play on consoles that cost less or the same.
As applicable to 3D space
In the early 1990s, as the field of virtual reality was beginning to be commercialized beyond academic and military labs, a startup called Worldesign in Seattle used the term "spatial computing" to describe the interaction between individual people and 3D spaces, operating more at the human end of the scale than previous GIS examples may have contemplated. The company built a CAVE-like environment it called the Virtual Environment Theater, whose 3D experience was a virtual flyover of the Giza Plateau, circa 3000 BC. Robert Jacobson, CEO of Worldesign, attributes the origins of the term to experiments at the Human Interface Technology Lab at the University of Washington, under the direction of Thomas A. Furness III. Jacobson was a co-founder of that lab before spinning off this early VR startup.
In 1997, an academic publication by T. Caelli, Peng Lam, and H. Bunke called "Spatial Computing: Issues in Vision, Multimedia and Visualization Technologies" introduced the term more broadly for academic audiences.
========
Yes, it's not an Apple-created term, and it does have its own history in the scientific world.
Apple just borrowed the term to describe apps with a third axis that you can see wearing AR headsets. "Spatial" is also associated with immersive audio codecs, for example Dolby Atmos, which has been around since 2012. Sony labeling things like "3D spatial editing" makes more sense than Apple claiming AR is "spatial computing" to avoid comparisons to other AR technology predecessors and competitors.
Tim Cook consistently emphasizes "spatial computing" to describe the mixed reality headset. While the term might technically fit, it feels overplayed, and here’s why.
Apple has a history of coining new terms to make their products stand out, and "spatial computing" is no different. Other companies typically refer to similar tech as AR or VR, but Apple tries to reframe it with this new label to create a sense of novelty.
It feels like a deliberate branding effort to make the Vision Pro seem more revolutionary than it is, especially given that similar devices are already on the market.
The term "spatial computing" is quite broad, covering a range of technologies that interact with physical space, including VR, AR, and gesture control. While Vision Pro undoubtedly pushes these boundaries, it doesn’t fully live up to the scope suggested by the term. Apple may be aiming for a future where digital and physical worlds merge seamlessly, but at launch, Vision Pro feels more like a high-end AR/VR device rather than the all-encompassing future of "spatial computing."
"Spatial computing" implies a fluid, seamless experience of interacting with digital elements in your physical environment. In reality, wearing a bulky headset, even as advanced as Vision Pro, still feels like an isolated, tech-heavy experience. Until the tech evolves further, this gap between the promise of "spatial computing" and the actual user experience makes the term oversold.
In short, when Tim constantly uses "spatial computing", it is marketing BS. The AVP is not so different from existing headsets, and Tim is trying to tell you a story of a new computing era, which it is not.
To do so, he uses "spatial computing" and avoids terms like AR/VR.
So it is not the term that bothers me, but Tim's approach of reframing existing technology as novelty.