
MacRumors

macrumors bot
Original poster
Apr 12, 2001
Apple is exploring multispectral imaging technology for future iPhone cameras that could improve Visual Intelligence, enhance material detection, and boost image processing, according to a new supply chain rumor out of China.


In a post on Weibo, leaker Digital Chat Station said Apple is currently evaluating components related to multispectral imaging within the supply chain, but cautioned that formal testing has not yet begun, suggesting the technology remains at an exploratory stage.

Multispectral imaging differs from traditional smartphone photography, which relies solely on standard red, green, and blue light. Instead, the technology captures image data across multiple, distinct wavelength bands, which can add sensitivity to near-infrared or other narrow spectral ranges. This could potentially allow cameras to detect information that is largely invisible to conventional sensors.
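As a rough illustration of what a single extra near-infrared band already enables, remote-sensing pipelines commonly compute the NDVI vegetation index from per-pixel red and NIR reflectance. This is a generic sketch with invented reflectance values, not anything from Apple's rumored design:

```python
import numpy as np

# Invented per-pixel reflectance (0..1) in two spectral bands.
red = np.array([0.10, 0.40, 0.25])  # visible red band
nir = np.array([0.60, 0.45, 0.30])  # near-infrared band

# Normalized Difference Vegetation Index: healthy vegetation reflects
# strongly in NIR but absorbs red, so its NDVI is close to 1.
ndvi = (nir - red) / (nir + red)
print(ndvi)  # first pixel is high (vegetation-like), the others low
```

The point is that the NIR/red contrast carries information a plain RGB sensor never records.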

If adopted in future iPhones, one potential advantage could be improved material and surface differentiation. By analysing how different materials reflect light across wavelengths, the iPhone's camera could more accurately distinguish skin, fabric, vegetation, or reflective surfaces, enabling cleaner subject recognition and more reliable portrait effects.
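One simple way such material differentiation could work in principle is nearest-neighbor matching of a pixel's multi-band reflectance against known reference signatures. Everything below (band count, signature values, material names) is invented for illustration:

```python
import numpy as np

# Invented reference reflectance signatures across four bands
# (blue, green, red, NIR) for a few materials.
signatures = {
    "skin":       np.array([0.25, 0.35, 0.45, 0.55]),
    "vegetation": np.array([0.05, 0.15, 0.08, 0.60]),
    "fabric":     np.array([0.30, 0.30, 0.30, 0.35]),
}

def classify(pixel):
    # Pick the reference signature closest in Euclidean distance.
    return min(signatures, key=lambda m: np.linalg.norm(signatures[m] - pixel))

print(classify(np.array([0.06, 0.14, 0.09, 0.58])))  # vegetation-like pixel
```

Real systems would use learned classifiers rather than a lookup table, but the principle, distinguishing materials by their reflectance curve rather than their RGB color, is the same.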

In addition, multispectral data could also improve image processing overall, especially when shooting in mixed lighting environments. It could also theoretically improve Visual Intelligence and Apple's on-device machine learning, leading to better object recognition, scene understanding, and depth estimation.

However, adding extra spectral sensitivity would likely require more complex sensor designs, which would increase costs and could tighten already limited internal space. This might be why Apple is reportedly still evaluating the technology rather than actively testing it in prototypes. Either way, it's not something we should expect in an iPhone soon.

In the same Weibo post, Digital Chat Station reiterated that the main camera on iPhone 18 Pro models will feature a variable aperture, while the telephoto camera will have a larger aperture, but Apple has yet to begin prototyping 200-megapixel cameras for future iPhones.

Article Link: Apple Reportedly Exploring Multispectral Imaging for Future iPhones
 

So you're saying Apple have appointed Dr. Egon Spengler as a special advisor, to develop an Apple spectral analyzer?



Can't wait for Phil Schiller to make a keynote demo return, to show off the new Multi-Planar Kirlian Emanation functionality – maybe with photos of Vigo paintings and the River of Slime. 👻
 
the Vision Pro 2 was also leaked

 
Would like to see this used to emulate Kodak's old Aerochrome, the company's iconic false-color infrared film, long since discontinued. People with old DSLRs are having their cameras modified to capture this bandwidth, which is deliberately filtered out of the sensor's range by default. A sensor that can capture infrared and output it as an option would be one up on DSLRs.
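For context, the Aerochrome look is usually emulated digitally by remapping channels: NIR drives red, visible red drives green, and green drives blue. A minimal sketch, assuming separate RGB and NIR captures with invented values:

```python
import numpy as np

# Invented captures: a tiny RGB frame and a separate NIR frame, values 0..1.
rgb = np.zeros((2, 2, 3))
rgb[..., 0] = 0.4        # red channel
rgb[..., 1] = 0.6        # green channel
nir = np.full((2, 2), 0.9)

# Aerochrome-style false color: NIR -> red, red -> green, green -> blue.
# Visible blue is discarded, as the film was insensitive to it behind
# its yellow filter.
false_color = np.stack([nir, rgb[..., 0], rgb[..., 1]], axis=-1)
print(false_color[0, 0])  # one remapped pixel
```

With this mapping, IR-bright foliage comes out in the film's signature red/magenta tones.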
 
I own a full-spectrum Fuji IS Pro, which is simply a Fuji S5 Pro with the infrared and ultraviolet filter removed. By putting an IR bypass or UV bypass filter on the lens, I can take photos in any spectral range I like. Most standard cameras (including those in phones) are sensitive to IR light and can be used for IR shots. The only disadvantage is that when using an external IR bypass filter, the exposure time becomes very long.
So this can be achieved by removing the standard UV/IR filters, taking a full-spectrum shot, and possibly removing the unwanted wavelengths afterwards by processing the shot with internal software.
 
Incidentally, I was thinking about this tech just yesterday: today only the light hitting the sensor head-on is recorded, while off-axis detail is not captured, and changing that could be the next level in the digital camera space. This might require the native camera app to be able to show detail the way 360-degree camera captures do on Android (I think the iOS Photos app doesn't have this natively yet, unlike Google Photos, which can display a 360 view), with more detail, depth, etc. But obviously, the camera lens may need to protrude a little to capture enough detail on a mobile device, unless they come up with better solutions.
 
All I know is that if this tech comes to an iPhone the lenses will be poking eyes out.

Then again, Vision Pro sales might improve alongside by doubling as safety goggles… 😂
 
I own a full-spectrum Fuji IS Pro, which is simply a Fuji S5 Pro with the infrared and ultraviolet filter removed. By putting an IR bypass or UV bypass filter on the lens, I can take photos in any spectral range I like. Most standard cameras (including those in phones) are sensitive to IR light and can be used for IR shots. The only disadvantage is that when using an external IR bypass filter, the exposure time becomes very long.
So this can be achieved by removing the standard UV/IR filters, taking a full-spectrum shot, and possibly removing the unwanted wavelengths afterwards by processing the shot with internal software.
I think the goal here is for the iPhone to receive the IR image separately, so it can be used for AI post-processing. If it just receives the fused image (which is what you get by removing the IR filter), it won't be able to remove the IR component in post-processing. It can certainly try, but the result won't be better than a plain visible-spectrum image.
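This point can be made concrete with a toy additive model (an assumption for illustration, with invented numbers): a filterless sensor records visible + IR as one number per channel, and that sum alone cannot be unmixed, whereas a separate IR measurement makes recovery trivial:

```python
import numpy as np

visible = np.array([0.30, 0.50, 0.20])  # true visible-light signal
ir      = np.array([0.25, 0.25, 0.25])  # IR leakage reaching each channel

fused = visible + ir  # what a filterless (full-spectrum) sensor records

# From `fused` alone, visible and IR cannot be separated: infinitely many
# pairs produce the same sum. With a dedicated IR capture, subtraction
# recovers the visible estimate directly.
ir_frame = ir.copy()            # hypothetical separate IR measurement
recovered = fused - ir_frame
print(np.allclose(recovered, visible))  # True
```

Real sensors aren't perfectly additive, but the underlying information argument holds: one fused channel destroys the separation a second measurement preserves.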
 
Would like to see this used to emulate Kodak's old Aerochrome, the company's iconic false-color infrared film, long since discontinued. People with old DSLRs are having their cameras modified to capture this bandwidth, which is deliberately filtered out of the sensor's range by default. A sensor that can capture infrared and output it as an option would be one up on DSLRs.
If you own a full-spectrum or a 550 nm-converted camera/filter, I have something to read for you:
 
To me, the big interest would come if this ended up being able to detect health-related aspects. If research on multispectral imaging identifies ways of interpreting these images, then it could be huge. However that might require multiple very narrow band filters.

I would like to have built-in polarising filters. Could be quite interesting to take images with a range of angles of polarisation, even just two at 90 degrees for a start, and combine in interesting ways - combined with other image adjustments.
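The two-orientations idea maps onto standard polarimetry: with exposures through polarizers at 0° and 90°, their sum approximates total intensity (Stokes S0) and their difference the linear polarization component (S1). A minimal sketch with invented pixel values (a partial estimate, since measuring the full linear state also needs a 45° exposure):

```python
import numpy as np

i0  = np.array([0.80, 0.50])  # intensity through polarizer at 0 degrees
i90 = np.array([0.20, 0.50])  # intensity through polarizer at 90 degrees

s0 = i0 + i90            # total intensity (Stokes S0)
s1 = i0 - i90            # horizontal/vertical polarization contrast (S1)
dolp = np.abs(s1) / s0   # partial degree of linear polarization
print(dolp)  # first pixel strongly polarized, second not at all
```

High values flag glare and reflective surfaces, which is exactly what makes combining such captures interesting.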
 
I own a full-spectrum Fuji IS Pro, which is simply a Fuji S5 Pro with the infrared and ultraviolet filter removed. By putting an IR bypass or UV bypass filter on the lens, I can take photos in any spectral range I like. Most standard cameras (including those in phones) are sensitive to IR light and can be used for IR shots. The only disadvantage is that when using an external IR bypass filter, the exposure time becomes very long.
So this can be achieved by removing the standard UV/IR filters, taking a full-spectrum shot, and possibly removing the unwanted wavelengths afterwards by processing the shot with internal software.
According to the link you posted, a Fuji IS Pro without UV and IR filters is sensitive to the wavelength interval 380-1000 nm. To me that seems like very little UV; the visible spectrum is usually regarded as 380-750 nm, so your camera barely touches into UV-A. To capture larger parts of the UV spectrum you need some very special quartz or fluorite lenses. One such lens is the Zeiss/Hasselblad 105mm f/4 UV Sonnar (it has high transmission down to 280 nm). It may cost somewhere close to $15,000 at auction. (Edit: found one on eBay for $24,000.)

It is not enough just to have lenses that can transmit a large part of the UV spectrum; you also must be aware of how much UV light is available. Sunlight outside the atmosphere goes down to roughly 200 nm, but most light below 280 nm is absorbed by the ozone layer and does not reach the surface. Between 280 and 315 nm, most is also absorbed by the atmosphere. Only UV-A, 315-400 nm (overlapping the visible spectrum), reaches the Earth's surface. If you want more, you have to use other UV light sources.
 