For example, if a user watches a video using Apple Vision Pro and the new AirPods, and turns their head to look in a specific direction, the sound source in that direction can be "emphasized to enhance the spatial audio/computing experience."
This makes zero sense. AVP already has everything mapped around it. Kuo is just confused here, but he also wrote another point that isn't mentioned here on MacRumors.
The IR camera can detect environmental image changes, potentially enabling in-air gesture control to enhance human-device interaction. It is worth noting that Apple has filed related patents in this area.
This seems like a more believable use case, but is Apple really going to put extra hardware in the small, light, and power-efficient AirPods so we can do weird gestures with our hands? Either there's a good use case for these cameras or Kuo is just wrong about this.
 
Yeah... no, I'm calling shenanigans on this one. First of all, an IR camera/sensor on AirPods could only be side-facing. Not sure what the benefits of that would be, especially in relation to what you're viewing on the Vision Pro. Also, as others have said, AirPods Pro already have excellent gyroscopic head tracking, so I doubt side-facing IR cameras could add anything to that experience.
 


Apple will launch new AirPods featuring infrared cameras to improve spatial experiences with the Vision Pro headset, according to Apple analyst Ming-Chi Kuo.

[Image: airpods-vision-pro.jpg]

In a post on Medium, Kuo explained that Apple plans to mass-produce new AirPods with integrated IR camera modules by 2026. These IR cameras will apparently be similar to the iPhone's Face ID receiver.

The purpose of IR cameras on AirPods is related to Apple Vision Pro and future headsets from the company, enhancing the spatial computing and audio experience. The IR cameras can detect environmental image changes, facilitating a broader range of gestures to improve user interaction. For example, if a user watches a video using Apple Vision Pro and the new AirPods, and turns their head to look in a specific direction, the sound source in that direction can be "emphasized to enhance the spatial audio/computing experience."
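To make the described behavior concrete: emphasizing "the sound source in that direction" amounts to re-weighting per-source gain based on head orientation. Below is a minimal Swift sketch of that idea; every name in it (SpatialSource, emphasize, focusWindow) is hypothetical and does not correspond to any actual Apple API.

```swift
import Foundation

// A sound source placed around the listener, e.g. a speaker in a spatial video.
struct SpatialSource {
    let name: String
    let azimuth: Double   // direction of the source in radians, 0 = straight ahead
    var gain: Double = 1.0
}

// Re-weight source gains from the head yaw reported by head tracking.
// Sources within `focusWindow` of the gaze direction are boosted, up to `maxBoost`.
func emphasize(sources: inout [SpatialSource],
               headYaw: Double,
               focusWindow: Double = .pi / 6,   // ~30 degrees
               maxBoost: Double = 1.5) {
    for i in sources.indices {
        // Smallest angular distance between head direction and the source.
        var delta = abs(sources[i].azimuth - headYaw)
        delta = min(delta, 2 * .pi - delta)
        if delta < focusWindow {
            // Ramp the boost: full at dead center, none at the edge of the window.
            let t = 1.0 - delta / focusWindow
            sources[i].gain = 1.0 + (maxBoost - 1.0) * t
        } else {
            sources[i].gain = 1.0
        }
    }
}
```

Whatever Apple actually ships would presumably do something far more sophisticated (HRTF filtering rather than plain gain), but a head-pose-to-emphasis mapping of this sort is the core of the rumored feature.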

Foxconn is said to be the supplier of the new IR component, with the Taiwanese electronics manufacturer preparing to provide enough parts for about 10 million AirPods initially. It is not clear which AirPods model is likely to get the feature first, but the AirPods Pro may be most likely since they already offer lossless audio exclusively with the Vision Pro.




Article Link: Kuo: New AirPods to Feature Cameras for Enhanced Spatial Experiences
Could this mean we get some parity with the gesture control UI on VP on iOS? Like throwing a gang sign in the ‘air’ to skip a track? Or tapping your fingers while looking at your iPhone display to interact with a button?
 
Guys, it’s an in-ear IR sensor to better determine inner-ear shape and therefore better direct Spatial Audio. Let’s not overcomplicate this with the poorly chosen “camera” description (it certainly can act as one, but I doubt Apple would feel the need to go beyond very low resolution for the intended purpose).



This will likely also be used for heartbeat readings…
 
Great, so now we're gonna be paying extra for expensive features that only 0.1% of us will use
 
I'm sorry you don't understand what IR (Infrared) means. To be fair, the article should have emphasized that the infrared cameras are looking outward, not inward.

These would barely be considered "cameras" at all by most people -- infrared cameras sense only changes in the environment, such as turning one's head. For spatial audio, that's important, but it's neither "seeing" nor "collecting" information about your environment.
I’m sorry you don’t understand what facetious means.
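Setting the bickering aside, the "sensing only changes" description above maps onto a very simple operation: differencing consecutive low-resolution IR frames. Here is a toy Swift sketch of that idea; motionScore and the 0.05 threshold are invented for illustration and imply nothing about actual AirPods hardware or firmware.

```swift
import Foundation

// Compare two successive grayscale IR frames (one byte per pixel) and report
// how much of the scene changed, normalized to 0...1. A static scene scores
// near zero; a large, coherent change is consistent with the wearer turning
// their head.
func motionScore(previous: [UInt8], current: [UInt8]) -> Double {
    precondition(previous.count == current.count, "frames must be the same size")
    var total = 0
    for (a, b) in zip(previous, current) {
        total += abs(Int(a) - Int(b))   // per-pixel intensity change
    }
    return Double(total) / (Double(previous.count) * 255.0)
}

// Example with tiny 4-pixel "frames": a jump in intensity reads as motion.
let frameA: [UInt8] = [10, 12, 11, 10]
let frameB: [UInt8] = [80, 75, 70, 65]
let moved = motionScore(previous: frameA, current: frameB) > 0.05   // true, ~0.24
```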
 
Is it too much to ask for Apple to offer high end pro features in pods without the rubber tips!??
As someone who will never use “sealing” type AirPods, unfortunately that seal is what allows for a dramatically increased control over what’s possible to project into the eardrum. Let alone features like noise isolation.
 
The Vision headset already has all the sensors to do what they want the AirPods to do, and if you are using both they obviously have to be synced together, so having redundant sensors is dumb. What Apple can't do is come up with a better lie to sell the public on there being more sensors on AirPods that see or listen to what you are doing at all times. What's left is for it to have some AI in it that detects what you are listening to and, using parental controls, can muffle audio that is inappropriate. That's how you slide in more tech that keeps track of everything you do with the device.
 
this development could be very interesting and useful in a lot of ways.

i think the rumor may hypothesize that the use-case scenario is visionOS-related, but i don't think that is any more than just one of the many ways this could be useful.

wearable tech is one of the most important areas apple is focusing on.
one obvious use that comes to mind immediately is for vision and hearing impaired persons.

this kind of tech is already on my car. when i put it into reverse to back into a parking space, the sensors on all 4 sides give me a top-down view of how close i am to the cars next to me.

go apple. skating to where the puck will be. not where it is.
 
Kuo, really? AirPods specifically for Vision Pro or whatever the next one is called?
Unless there’s a benefit to all other users of AirPods, I highly doubt this.
don’t see anything in the report about this being VP-exclusive; can pretty easily see it being advertised as improving Spatial Audio as a whole—they do quite aggressively push it as “the best listening experience” on Apple Music. (could potentially augment the Spatialize Stereo function as well? not like I think that’s worth much…)

as @alexandr pointed out, doesn’t seem like “cameras in AirPods” is the right phrase for this, but it’s an interesting R&D move on Apple’s part. (it’s only in recent months I’ve come to enjoy some Atmos mixes on AM, particularly recent releases where the mixing was done along with the album’s creation.) we’ll see if this comes to fruition given Kuo’s degrading track record.
 
As someone who will never use “sealing” type AirPods, unfortunately that seal is what allows for a dramatically increased control over what’s possible to project into the eardrum. Let alone features like noise isolation.

Yeah I know it’s far from optimal for noise canceling etc but I’d still love the option. I will never buy sealing AirPods, hate it.
 
It's interesting that Apple (and other consumer product companies) are considering using sensors in the infrared spectrum. We have generally been oblivious to NIR light for the past 20-30 years. OG incandescent lights would kick out about half their energy as infrared light, but "efficient" LED home lighting only emits light in the visible spectrum 😢. It turns out that abundant infrared light is crucial for the performance of our mitochondria (search on NIR light mitochondria for science papers and articles on that topic). You can now buy screw-in LED bulbs that generate plenty of red/NIR light (and less "blue light") -- or bulbs that even adapt their radiation spectrum during the day. With sensors, health-reporting apps could monitor how much red/NIR light you're getting. Sunlight provides abundant red/NIR; "get in the sun" is great advice. At the same time, broader-spectrum indoor lighting -- like we used to have -- is an excellent thing for our overall health.
Great post. I do spend at least 1 hour in direct sunlight. Unfortunately, I do use energy efficient LED home lighting. Always feel the difference without sun or if I can’t get enough sunlight.
 
Yeah I know it’s far from optimal for noise canceling etc but I’d still love the option. I will never buy sealing AirPods, hate it.
Same here, I just don’t get the physics of how you would get it to work at an acceptable level without either sealing out outside air or blocking it like the over-ear AirPods Max.
 
Great post. I do spend at least 1 hour in direct sunlight. Unfortunately, I do use energy efficient LED home lighting. Always feel the difference without sun or if I can’t get enough sunlight.
Thanks. You can still use narrow-spectrum LEDs for many fixtures; you do want to limit their use after sunset. A few replacement "enlightened" LED bulbs from blockbluelight.com or equivalent can get you set up pretty well. You can also get a small red/NIR panel from a company like Mito Red Light and supplement existing lighting at night. Many of their newer panels have an iOS app and allow you to dial the LED intensity from 1% to 100%. 2 chips per lens allows you to select all-red or all-NIR (or a blend); I highly recommend all-NIR at night to avoid an over-bright light.

Retinas are the mitochondria-canaries in our bodies; they have the highest density of mitochondria of any cells in our bodies. Translating light to electrical signals with low latency is a real energy hog! Our retinas count on red/NIR light to help mitochondria do their job, but helpful frequencies are in short supply after sunset. That's one of the reasons why LED night lights are painful if you use them in the bathroom. A red night light is far easier on the eyes and definitely better than turning on the main bathroom light.
 