It is not enough to just have a lens that transmits parts of the UV and IR spectrum; then you will just get higher levels at the blue and red sensors. If you want real multispectral functionality, you also have to add sensors sensitive only to UV or IR light, which is easier said than done. And the narrower the spectral bands you capture, the less light sensitivity your camera will have.
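To make that last trade-off concrete, here is a rough numerical sketch (hypothetical Gaussian passbands and a flat illuminant, all numbers made up) comparing the light collected by a broad Bayer-style channel with a narrow multispectral band:

```python
import numpy as np

# Wavelength grid in nm across the visible range, with a
# hypothetical equal-energy illuminant for simplicity.
wavelengths = np.linspace(400, 700, 301)
illuminant = np.ones_like(wavelengths)

def collected_signal(center_nm, fwhm_nm):
    """Relative signal from a channel with a Gaussian passband."""
    sigma = fwhm_nm / 2.355  # convert FWHM to standard deviation
    response = np.exp(-0.5 * ((wavelengths - center_nm) / sigma) ** 2)
    return np.trapz(response * illuminant, wavelengths)

# Broad ~100 nm color-filter channel vs. a narrow ~20 nm spectral band
broad = collected_signal(550, fwhm_nm=100)
narrow = collected_signal(550, fwhm_nm=20)
print(f"Narrow band collects {narrow / broad:.0%} of the broad band's light")
```

With these numbers the narrow band collects about 20% of the light, so every extra narrow band you add costs sensitivity somewhere.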
 
Metalens tech finally about to be implemented? Edit: https://metalenz.com/our-technology/ is up. You can see some cool videos on the tech there.



This stuff is an actual lens revolution... Insane stuff... No more camera bump, a front camera behind the screen finally, etc.

I took a deep dive into the tech when I first heard of it around 2024-25. Honestly, I am quite sure this is what Apple was talking about when they said they were "waiting for the tech" for a thin and light AVP.



2026/27 is the launch window they stated in 2025. Very cool tech.
 
Good to see that iPhone cameras are improving! Yearly improvements are very good. I think it could be part of the 20th anniversary iPhone model.
 
The iPhone doesn't use multispectral sensors? That explains its inaccurate, warm white balance.

On the other hand, the Pixels use these sensors for white balance segmentation (applying different white balance to different parts of the image). But sometimes it goes so far that it neutralizes vibrancy. I hope the iPhone doesn't go that route.
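For anyone curious what white balance segmentation actually means, here is a minimal sketch (my own simplification using a per-tile gray-world estimate, not Google's actual pipeline):

```python
import numpy as np

def segmented_gray_world(img, tiles=4):
    """Per-tile gray-world white balance on a float RGB image of shape (H, W, 3).

    Splits the image into a tiles x tiles grid, estimates the illuminant in
    each tile from its channel means, and scales each tile so its average
    color comes out neutral. Real pipelines blend estimates across tile
    boundaries; this sketch applies them hard-edged for clarity.
    """
    out = img.copy()
    h, w = img.shape[:2]
    for i in range(tiles):
        for j in range(tiles):
            ys = slice(i * h // tiles, (i + 1) * h // tiles)
            xs = slice(j * w // tiles, (j + 1) * w // tiles)
            region = img[ys, xs]
            means = region.reshape(-1, 3).mean(axis=0)      # per-channel average
            gains = means.mean() / np.maximum(means, 1e-6)  # push the average to gray
            out[ys, xs] = np.clip(region * gains, 0.0, 1.0)
    return out
```

The over-neutralizing complaint falls out of the math: forcing every region's average toward gray also flattens scenes that are legitimately colorful, like a sunset or a single-color wall.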
 