Apple plans to announce its long-rumored mixed reality headset during an event planned for January 2023, Apple analyst Ming-Chi Kuo said today.

[Image: Apple mixed reality headset mockup]

In a tweet, Kuo offered a more precise timeline for Apple's mixed reality headset, which despite speculation did not make an appearance during yesterday's WWDC keynote.

According to Kuo, Apple will hold an event in January to reveal the product, with tools for developers shipping "within 2-4 weeks after the event." Pre-orders for the headset will start in the second quarter of 2023 with customers able to purchase the headset before next year's WWDC.

Multiple reports have highlighted ongoing issues with the development of the headset, delaying a launch until next year. The headset is expected to be a niche device that costs somewhere around $3,000. Learn more about the headset using our roundup.

Article Link: Kuo: Apple to Hold Special Event in January to Announce Mixed Reality Headset
 
If true, it's difficult to imagine that the Freeform app they previewed (which I believe they said is coming later, not as part of the OS launches) isn't a big part of it.

I remember Tim Cook in an interview—maybe with Kara Swisher—saying something about AR and (paraphrasing), "Imagine if instead of talking we were right there with each other and I could pull up a presentation and show it to you."

It was not a very imaginative example (especially since iChat Theater used to have that exact feature before Apple got rid of it). But it did show where his mind was at in terms of the purpose of such a device.

I honestly didn't pay that close attention to the keynote because it was kind of boring, but a number of things seemed to be circling around AR. For example, I still have no idea what spatial audio is (I don't have any devices that support it... so there's that). But anyhow, they were talking about the new personalized spatial audio. Again, I'm not really sure what it is, but it sounds like something you might want if you're creating an AR/VR product.
 
For $3,000, this device should be absolutely amazing. Not good, AMAZING. I'm skeptical that Apple can deliver a breakthrough experience never before seen in the similar VR/AR headsets already available for a fraction of that price. Something better? Definitely. But when you try to charge $3,000 while the Oculus Quest costs a tenth of that, it should be at least 10 times better. With the current state of technology and what is possible, I really can't see how that 10x factor could be achieved. The worst-case scenario would be something like the AirPods Max: a product that is marginally better than what the competition is offering, but nothing special. I remember the hype before the release of the Apple Watch. Everyone was imagining a watch with a big curved display around your wrist, and we basically received a half-baked product with limited functions and usefulness, one that had to be charged every day and had some limited health and fitness features.
 
Aside from Apple's approach to the design and features of this device, there's nothing here for most people to get excited about. It will cost a fortune and probably have limited applications: an expensive novelty for hardened Apple fans.
 
They always have at least two announcements at events, so why wouldn't this "Special" event include announcements other than just the headset? Doesn't make sense when you think about it.
 
Didn't the iPhone get announced in January? If that's the case, it's only seven months away. Either way, with the leaks, patents, Tim interviews, etc., it's close.
 
My guess is that Apple's AR will be a context-based system that works with the headset, your phone, or another device. Apps would be released to fill each context; a rough sketch of how an app might plug into such a system follows the examples below.

Examples:
If you point it at your window, you'll see weather information.
If you point it at a bunch of ingredients, it will show you recipes you can make with those.
If you point it at a HomePod that's playing music, it will show you the album art, track info, etc.
If you point it at a TV, it will show you information about the show or movie you're watching.
If you point it at a person, they get a prompt on their watch/phone, and when they press it, you see their contact info.
If you point it at an animal or plant, you'll get information about it.
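
To make that concrete, here is a minimal, purely hypothetical Swift sketch of the app-facing side of such a context system. None of these types or names exist in any real Apple SDK; they're invented only to illustrate the idea of an app declaring which contexts it can fill and handing back a small card for the system to overlay.

```swift
import Foundation

// Hypothetical only: every type and name below is made up for illustration.

/// Kinds of things the headset (or phone camera) might recognize.
enum DetectedContext {
    case window
    case ingredients([String])
    case homePodPlaying(track: String)
    case tv(nowShowing: String)
    case person(contactID: UUID)
    case plantOrAnimal(species: String)
}

/// The small overlay an app hands back for the system to render.
struct ContextCard {
    let title: String
    let detail: String
}

/// An app opts in by declaring which contexts it handles and producing a card.
protocol ContextProvider {
    var supportedContexts: [String] { get }   // e.g. ["window", "ingredients"]
    func card(for context: DetectedContext) -> ContextCard?
}

/// Example: a weather app that only responds when a window is detected.
struct WeatherProvider: ContextProvider {
    let supportedContexts = ["window"]

    func card(for context: DetectedContext) -> ContextCard? {
        guard case .window = context else { return nil }
        return ContextCard(title: "Outside now", detail: "18°C, light rain until 3 PM")
    }
}
```

The point is that each app only fills the contexts it claims to handle, and the system (headset, phone, whatever) decides which cards actually get surfaced.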
 

And if you're a cardiothoracic surgeon looking into the chest cavity of a patient you'll see an array of patient vital signs, enhanced visualization of organs, medical history that can be summoned up, documents supporting procedures, imagery from past procedures, collaborations with other surgeons, etc. This has been reality for a number of years.

Similar possibilities if you're a safety inspector of a nuclear power plant, a weekend mechanic working under the hood of their car trying to diagnose and fix a problem, an insurance adjuster assessing damage of a customer's vehicle, a gardener looking at their garden and assessing the health and condition of plants, and on and on and on. It's pretty limitless.

My guess is it will be glasses tied to the user's iPhone via a wireless UWB video data link. The AR compute horsepower will be the A-series CPU/GPU in the phone, along with the phone's battery, which already exists to service that horsepower. The glasses will have a tiny battery with just enough capacity to handle the small displays and cameras.
 

I really think we have no idea what we're in for, and I get the feeling this will be one of those super rare '07 keynote moments that yield a distinct "the way things were before the keynote" and "the way things are now" kind of moment. While consumer demand for an Apple-branded MR headset is nowhere near as strong as it was for an Apple-branded mobile phone, the climate does feel similar(ish) to the climate leading up to that '07 keynote. There are MR headsets available right now. I've used them. And while fun, they do feel like they're just barely scratching the surface of possibility, almost to the point of being no more than a novelty. Something that some have been happy to explore while the vast majority of consumers have been slow to adopt, and certainly do not feel compelled to explore. I think after the Apple MR keynote, that dynamic will flip and we'll see this category explode. Feels like it will be one of those classic "I had no idea I needed this" kind of moments/experiences/devices.

Spatial Audio and AirPods Max are 100% intended to enhance their MR* headset experience. So much so that I believe they will offer an MR* headset and AirPods Max bundle that unlocks certain features only available to users with that specific combo. This is where that rumored $3k price tag comes into play. While I do not think it will end up that high (it was planted more to get us worked up and make anything less than $3k seem like a steal), I do think there will be a higher price point for a bundled option and a lower price point for an MR headset-only option. Obviously, I have no idea. Just spit-balling. Will be interesting to see how this all plays out.



* In this case, MR ≠ MacRumors.
 
That price simply can't be right. Apple doesn't have a history of launching new device categories at a price point that is only for rich people.
 
I could be wrong, but for some reason I suspect we’re going to have an iPad price announcement moment when these hit.

“People said we would charge $2999, but…”

No guess on real price, but I don’t see how this is a product at $2999. I barely see this as a product at $999, but $999 as a starting point sounds way more on point to me than $2999 in terms of market viability. I really struggle with the idea of this being some niche $2999 device. Doesn’t seem like Apple, especially under Tim Cook, but who knows. Not much from Apple makes a ton of sense right now.
 
I would assume the entire thing will be intent-based. The glasses do all the object detection instead of the apps: "Oh, I've detected a window; which apps care about windows?" Then the relevant apps are presented or, in some cases, open automatically. This would prevent the apps from spying on you; all they ever know is "there is an object of this type, here are its specs."
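
For what it's worth, here's an equally hypothetical Swift sketch of the system-side dispatch I'm imagining: apps register an up-front declaration of interest, and when the glasses detect something, they pass along only a type tag and a few specs, never the camera feed. Again, every name here is made up; nothing like this is a real Apple API.

```swift
import Foundation

/// The only thing an app is allowed to see: a type tag plus a few specs.
struct DetectedObject {
    let kind: String              // "window", "homepod", "tv", ...
    let specs: [String: String]   // e.g. ["nowPlaying": "Track name"]; no imagery
}

/// An app's up-front declaration of interest (think: a manifest entry).
struct InterestDeclaration {
    let appName: String
    let kinds: Set<String>
}

/// System-side dispatcher: given a detection, find the apps that asked for that kind.
struct ContextDispatcher {
    private(set) var declarations: [InterestDeclaration] = []

    mutating func register(_ declaration: InterestDeclaration) {
        declarations.append(declaration)
    }

    func apps(for object: DetectedObject) -> [String] {
        declarations
            .filter { $0.kinds.contains(object.kind) }
            .map(\.appName)
    }
}

// Usage: two apps declare interest; the glasses report a window.
var dispatcher = ContextDispatcher()
dispatcher.register(InterestDeclaration(appName: "Recipes", kinds: ["ingredients"]))
dispatcher.register(InterestDeclaration(appName: "Weather", kinds: ["window"]))

let detection = DetectedObject(kind: "window", specs: ["facing": "east"])
print(dispatcher.apps(for: detection))   // ["Weather"]
```

The privacy angle is the whole point of this shape: the recognition happens on the glasses/phone, and third-party apps only ever see the abstract object description they declared interest in.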
 