Apple's iPhone 16 series next year will adopt a stacked rear camera sensor design across the lineup, following similar adoption in this year's standard iPhone 15 models, according to Apple industry analyst Ming-Chi Kuo.
This year's lower-end iPhone 15 and iPhone 15 Plus models are expected to feature a 48-megapixel rear camera with a stacked CMOS image sensor (CIS) design that can capture more light.
Production yield issues with the new sensor design have likely prevented Apple from adopting it across all iPhone 15 models this year. While Sony's high-end CIS capacity is expected to remain tight through 2024, Apple has secured most of Sony's orders ahead of time.
According to Kuo, Sony's tight capacity is expected to benefit rival supplier Will Semi, which will obtain more orders for high-end CIS from Chinese smartphone brands as a result.
Rumors suggest the 48-megapixel wide-angle camera used in the iPhone 16 Pro Max will feature an eight-part hybrid lens with two glass elements and six plastic elements, along with improvements for the telephoto and ultra wide camera lenses.
Both the iPhone 16 Pro and the iPhone 16 Pro Max could get periscope telephoto lenses in 2024. In 2023, the iPhone 15 Pro Max will be the only device to get the new camera technology because of size constraints.
Article Link: Kuo: iPhone 16 Pro Models to Adopt Stacked Camera Sensor Design
> 48MP downsamples to 12MP with less noise and better detail than a native 12MP photo.

Has anybody done the experiment comparing the two? Not doubting your statement - just asking for information.
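Not a real-world camera test, but the statistics behind that claim can be sketched with synthetic data: averaging each 2×2 block of noisy pixels (roughly what 48MP-to-12MP binning does) cuts the noise standard deviation in half, a √4 improvement. A minimal NumPy sketch, with signal and noise values invented for illustration and the arrays scaled down from real sensor dimensions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "scene": a flat gray signal plus Gaussian per-pixel noise.
# (Values are arbitrary; a real 48MP sensor is ~8064x6048 pixels.)
signal = 100.0
noise_sigma = 10.0
native = signal + rng.normal(0.0, noise_sigma, size=(800, 600))

# 2x2 binning: average each 2x2 block -> quarter the pixel count,
# which is what 48MP -> 12MP downsampling does in spirit.
binned = native.reshape(400, 2, 300, 2).mean(axis=(1, 3))

# Since the signal is constant, std() measures noise alone.
print(f"'48MP' per-pixel noise: {native.std():.2f}")  # ~10.0
print(f"'12MP' binned noise:   {binned.std():.2f}")   # ~5.0 (10 / sqrt(4))
```

In practice the gain is smaller than the ideal √4, since real sensor noise isn't purely Gaussian and demosaicing gets in the way, but the direction matches what people report.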
Thank you. That helps me understand what they are talking about.
Stacked sensors don't gather more light. The amount of light hitting the sensor is the same. The only way you're going to gather more light is if the aperture (the hole the light comes through) is bigger, or the sensor itself is bigger so it catches more of the light.

What stacked sensors do is process the signal coming off the sensor faster, increasing readout speeds. That means less rolling shutter. It also allows for more resolution, faster frame rates, and probably faster, more accurate subject recognition. What it doesn't do is directly improve low-light performance; but with all that extra processing capability, computational photography kicks in and ultimately helps produce better low-light output, using the phone's HDR/frame-stacking blending in real time.
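To put rough numbers on both points: rolling-shutter skew scales with how long the sensor takes to read out all of its rows, and blending N aligned frames improves SNR by about √N. A back-of-the-envelope sketch; the readout times, object speed, and frame counts are all illustrative assumptions, not actual sensor specs:

```python
import math

# Assumed full-frame readout times (illustrative, not real specs).
conventional_readout_s = 1 / 30   # ~33 ms for a conventional CIS
stacked_readout_s = 1 / 150       # ~6.7 ms with stacked DRAM/logic

def skew_px(object_speed_px_s: float, readout_s: float) -> float:
    """Horizontal skew of a moving object: rows read later see it shifted."""
    return object_speed_px_s * readout_s

speed = 3000.0  # object crossing the frame at 3000 px/s (assumed)
print(f"conventional skew: {skew_px(speed, conventional_readout_s):.0f} px")
print(f"stacked skew:      {skew_px(speed, stacked_readout_s):.0f} px")

# Faster readout also leaves time to capture and blend more frames:
# averaging N aligned frames improves SNR by sqrt(N).
for n in (1, 4, 9):
    print(f"stacking {n} frame(s) -> SNR x{math.sqrt(n):.1f}")
```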
I think the rumor is totally wrong. It doesn't even make sense that only the iPhone 15 and 15 Plus would get a stacked sensor, because even now stacked sensors are available only in flagship full-frame and APS-C cameras. Yes, the stacked sensor is expensive. It allows faster readout speeds and better performance, so you can use an electronic shutter as the main shutter without a mechanical shutter. It is another revolutionary tech for cameras, but like I said, stacked sensors ARE expensive. Also, a stacked sensor does NOT gather more light; it just works way faster than normal sensors. I have no idea why Kuo said that, but he's definitely not a camera professional.
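On the electronic-shutter point, the classic obstacle is banding under flickering artificial light: mains lighting pulses at twice the line frequency, and every flicker cycle that elapses during readout shows up as a light/dark band. A quick illustrative calculation, reusing the same assumed readout times as the sketch above:

```python
# Mains lighting flickers at 2x the line frequency (100 Hz in 50 Hz regions).
flicker_hz = 100.0

# Readout times are assumptions, as above; bands per frame is roughly the
# number of flicker cycles that fit inside one full-sensor readout.
for name, readout_s in [("conventional", 1 / 30), ("stacked", 1 / 150)]:
    print(f"{name}: ~{flicker_hz * readout_s:.1f} bands per frame")
```

The faster the readout, the fewer bands, which is part of why stacked sensors make an electronic-only shutter viable.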
A low-end iPhone with stacked sensors already sounds like nonsense.