
MacRumors



Apple's iPhone 16 series next year will adopt a stacked rear camera sensor design across the lineup, following similar adoption in this year's standard iPhone 15 models, according to Apple industry analyst Ming-Chi Kuo.

[Image: iPhone 15 in blue, three-quarters perspective]

This year's lower-end iPhone 15 and iPhone 15 Plus models are expected to feature a 48-megapixel rear camera with a stacked CMOS image sensor (CIS) design that can capture more light.

Production yield issues with the new sensor design have likely prevented Apple from adopting it across all iPhone 15 models this year. While Sony's high-end CIS capacity is expected to remain tight through 2024, Apple has secured most of Sony's orders ahead of time.

According to Kuo, Sony's tight capacity is expected to benefit rival supplier Will Semi, which will obtain more orders for high-end CIS from Chinese smartphone brands as a result.

Rumors suggest the 48-megapixel wide-angle camera used in the iPhone 16 Pro Max will feature an eight-part hybrid lens with two glass elements and six plastic elements, along with improvements to the telephoto and ultra-wide camera lenses.

Both the iPhone 16 Pro and the iPhone 16 Pro Max could get periscope telephoto lenses in 2024. In 2023, the iPhone 15 Pro Max will be the only device to get the new camera technology because of size constraints.

Article Link: Kuo: iPhone 16 Pro Models to Adopt Stacked Camera Sensor Design
 
This article is confusing :/

So does that mean that only the lower end models will feature this new and improved stacked sensor this year? That makes no sense.

Shouldn’t the Pro models get the feature first, before it trickles down to the standard models?
 
Well, that's been known for a while. It will use a custom Exmor T sensor, dubbed the 903, tweaked for video recording.
 
This article is confusing :/

So does that mean that only the lower end models will feature this new and improved stacked sensor this year? That makes no sense.

Shouldn’t the Pro models get the feature first, before it trickles down to the standard models?
Also didn't understand that part, so this year Apple is adding superior tech to the regular lineup? It has always been the other way around. Very confusing indeed.
 
Nope, it's rumored that both 16 Pro and 16 Pro Max will be slightly longer to accommodate that periscope lens. That's the reason why it will also come to the smaller Pro that year.
Actually, the rumour is that the 16 Pro and Max will have slightly larger screens. Then there is the rumour that bezels will be thinner on the iPhone 15. That alone would lead to slightly larger screen sizes. Still, if Apple wanted both the 15 Pro and the Max to have a periscope lens, I'm pretty sure they would have been able to do so. Now they have, like every year, a nice differentiator. And next year the same.
 
So does that mean that only the lower end models will feature this new and improved stacked sensor this year?
The regular models still have smaller sensors than the Pro models. The stacked sensor will be an improvement of the 15 over the 14, but it will presumably still be worse in terms of light capture than the Pro’s larger sensors.

Camera tech is highly sophisticated. There isn’t a simple better-worse axis. It evolves along multiple dimensions with different trade-offs.
 
A lens does not have pixels. A lens is a piece of transparent glass or plastic or another medium, and it is certainly not pixelated. A lens may sit in front of a 48-megapixel sensor, but that is something completely different. I've seen this sloppy writing too many times lately.

End Friday rant.

/M
 
I’m upgrading to the 15 Pro Max, so I don’t really care what the 16 Pros look like, as they won’t be here for over a year. I’ve never had a Max; I used to have the Plus, so I’m excited to have the bigger display, longer battery life, and, apparently now, a better camera on the Max.
 
Since I got my iPhone 13 Pro refurb, I have had zero interest in any new iPhone, 15 or 16. I think the only time I'll be interested is when they make a drastic change to its design, like a foldable device.
 
Stacked sensors don't gather more light. The amount of light hitting the sensor is the same. The only way to gather more light is to make the aperture, the camera hole, bigger, or to make the sensor itself bigger so it captures more of the light.

What stacked sensors do is process the light hitting the sensor faster, increasing readout speeds. That means less rolling shutter. It also allows for more resolution, faster frame rates, and probably faster, more accurate subject recognition. What it doesn't do is directly improve low-light performance, but with all the extra processing capability, computational photography will kick in and ultimately help produce better low-light output using the phone's real-time HDR/frame-stacking capabilities.
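
That blending point is easy to demonstrate in isolation. Here's a minimal, hypothetical sketch (plain NumPy, nothing Apple- or Sony-specific, all values made up) of the simplest form of frame stacking: averaging N noisy captures of a static scene cuts random noise by roughly a factor of √N, which is why a readout fast enough to grab extra frames per shot helps low-light output even though each individual frame gathers no more light.

```python
import numpy as np

# Toy illustration of multi-frame stacking; the scene and noise values
# are illustrative, not real iPhone sensor characteristics.
rng = np.random.default_rng(42)

scene = np.full((256, 256), 100.0)  # "true" brightness of a flat test scene
noise_sigma = 20.0                  # per-frame random noise (read + shot)
num_frames = 8                      # frames a fast stacked readout could grab

# Simulate several noisy captures of the same scene, then average them.
frames = scene + rng.normal(0.0, noise_sigma, size=(num_frames, *scene.shape))
stacked = frames.mean(axis=0)

print(f"single-frame noise: {np.std(frames[0] - scene):5.2f}")
print(f"stacked noise:      {np.std(stacked - scene):5.2f}")
print(f"expected:           sigma/sqrt(N) = {noise_sigma / np.sqrt(num_frames):.2f}")
```

Real pipelines also align frames and weight them before merging, but that √N intuition is the core of why faster readout feeds better computational low-light results.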
 
Off-topic: does anyone know the background used in the render at the top of the article?
 