Well then we disagree on the meaning of the term “digital zoom”. There is nothing “digital” about cropping.
If the optics don’t change, it’s digital zoom.
by definition, if you’re zooming without changing the optics, it’s digital zoom.

The iPhone 17 Pro’s 8x is just a center crop of the 48MP telephoto, no interpolation, so still digital.
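Back-of-envelope, assuming a plain 2x linear center crop with no upscaling (illustrative numbers only):

```swift
// Rough numbers for an 8x frame taken as a plain center crop of the 4x telephoto.
// Illustrative only; assumes no interpolation, exactly as described above.
let telephotoPixels = 48_000_000.0   // 48MP sensor behind the 4x lens
let opticalZoom = 4.0
let cropFactor = 2.0                 // linear crop needed to go from 4x to 8x

let effectiveZoom = opticalZoom * cropFactor                        // 8.0
let remainingPixels = telephotoPixels / (cropFactor * cropFactor)   // 12,000,000

print("\(effectiveZoom)x from a crop that keeps \(Int(remainingPixels)) pixels")
```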
 
Folks, I'm just talking about physical hardware here. Of course Apple has made many changes to its processing pipeline and software since the 14 Pro. There is one physical upgrade that is very real: they started using an improved anti-reflective coating on the lens of the 16 Pro, so that is a nice physical upgrade for sure, but other than that it's physically the same as the 14 Pro main camera: https://fstoppers.com/landscapes/apples-secret-upgrade-iphone-16-pro-yields-incredible-results-682936#:~:text=I won't dwell too much on these tests because I don't think the main sensor or its technology has changed much (if at all) since the iPhone 14 Pro

To be fair, PetaPixel says the faster readout speed does give the 16 Pro one benefit: slightly sharper/clearer photos of moving objects (or if the camera is the one moving).

But what I'm talking about here is more serious photography using ProRes, ProRAW, or Halide's Process Zero, where computational processing isn't there to help; there are no meaningful improvements compared to the 14 Pro.


Bottom line, that main camera is overdue for a significant upgrade (remember, this is a "Pro" iPhone we're talking about, folks). I'm really hoping they go with a bigger sensor that uses Sony's new two-layer transistor pixel tech.
 
They don’t have to, and they take the ready-when-it’s-ready approach. They know it’s still going to sell gangbusters without it regardless.
 
What’s more concerning is that the claimed 8x optical zoom is essentially a lie.

The 17 Pro does not have 8x optical zoom.

It only offers 4x optical zoom, and the image is then digitally cropped to simulate 8x.

That’s actually less optical zoom than the iPhone 16 Pro (5x).
Do you have an analysis on the new tetraprism? If not, how do you know there isn’t an 8x light path in there?
 
I'm really hoping they go with a bigger sensor that uses Sony's new two-layer transistor pixel tech.

It’s basically just physics. The largest sensors in use in smartphones today are 1-inch; the main iPhone sensor is just slightly smaller. Sensors larger than 1 inch would also necessitate larger lenses, i.e. an even more pronounced camera bump/plateau. At some point, the phone proportions (and weight balance) are just out of whack.
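A rough sketch of that scaling argument, assuming the field of view and f-number stay fixed (the focal length, f-number, and diagonals below are made-up illustrative values, not real iPhone specs):

```swift
// Same field of view  => physical focal length scales with sensor diagonal.
// Same f-number       => entrance pupil (and roughly the lens) scales with focal length.
// All numbers below are illustrative assumptions, not real specs.
let currentDiagonal = 16.0   // mm, ballpark for a ~1"-class sensor
let currentFocal = 9.0       // mm, hypothetical physical focal length
let fNumber = 1.8

func lensFor(diagonal: Double) -> (focalMM: Double, pupilMM: Double) {
    let focal = currentFocal * (diagonal / currentDiagonal)
    return (focal, focal / fNumber)
}

print(lensFor(diagonal: 16.0))  // (focalMM: 9.0,   pupilMM: 5.0)
print(lensFor(diagonal: 22.0))  // (focalMM: ~12.4, pupilMM: ~6.9) -- everything grows ~38%
```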
 
The 14 Pro and 17 Pro main cameras have the same sensor... Look, I get it, they can't be upgrading camera hardware every year or two, especially if it's a good sensor to begin with. And the main camera in the 14 Pro was indeed really good for its time, but using that same hardware in the 14 Pro, 15 Pro, 16 Pro, and 17 Pro is just bananas. This is not one of those silly "specs for specs' sake" things: the main camera is the one you're going to be using the majority of the time, so the main camera deserved the priority for the 17 Pro, not the telephoto. They should have kept the same telephoto and upgraded the main camera hardware instead.

Apple has been using Sony sensors for years, and since 2021 Sony has been talking about its new two-layer transistor pixel sensor tech. This sensor has been ready for production for a while now and has already shipped in several competing phones. The 17 Pro main camera should have had this sensor! It might even allow them to do something like a video night mode.

I'm just sad about this and I'm honestly shocked the 17 Pro main camera hasn't been upgraded since the 14 Pro. It's an incredible phone with that one downside.
This is blatantly false. iPhone 14 and 15 use the Sony IMX803. iPhone 16 and (presumably 17) use the IMX903. They seem to be updating main sensors every 2 years or so. And each year a different camera gets the spotlight, last year it was the 48MP Ultrawide and this year it's the 48MP Telephoto.
 
This is blatantly false. iPhone 14 and 15 use the Sony IMX803. iPhone 16 and (presumably 17) use the IMX903. They seem to be updating main sensors every 2 years or so. And each year a different camera gets the spotlight, last year it was the 48MP Ultrawide and this year it's the 48MP Telephoto.
Go read about the difference between those two sensors: they are physically identical, but the 16 Pro has a faster readout speed. There is speculation about what changed to make the readout faster, but that is the only difference (and the lens on the 16 Pro has a better anti-reflective coating than the 15 Pro's).
 
Not only a sensor upgrade, but an optics upgrade would almost be better. For example, night mode on the 15, 16, and most likely the 17 Pro models will give the same results. Every time I've taken a photo of the night sky, the center of the image is brighter than the edges because the lens doesn't cover (vignettes) the sensor evenly, and that is difficult to edit out. The optics need to be better and have better covering power for the sensor. Making the optics larger with a new design is what I'd like to see on the cameras of future iPhones. It almost feels like they are afraid to make a fantastic camera because it may hurt the DSLR market.

I have a 16 Pro Max, and seeing the exact same camera on the 17 Pro Max is one reason I did not update my phone. I'd get the same results in a different body/frame. The main camera optics on my Galaxy S25 Ultra don't have the center hot spot that my iPhone's do. Nor does the 1x camera on the S25 Ultra show ghosting or floating reflections when taking photos or video with bright lights. I really like the iPhone; I'm just done waiting for the cameras to be dramatically updated. This is why I've moved to Samsung. I've mocked Android for years and thought it was useless, but after having one, I love it.

I might upgrade my 16 Pro Max to the 17 Pro Max in a few months, but it will be just for the design. If not, I'll wait and see if Apple improves the camera optics and sensors. I've had an iPhone since 2009, and I've always upgraded to the new iPhone each year. This will be the first year I've skipped, due to the lack of camera innovation. But I think that's Apple's marketing: upgrade the phone just enough to sell more, and people buy the new iPhone just to have it. I always upgrade just for the camera, but since Apple has not updated the 1x and 0.5x optics in three years, I've lost interest.

I also can't control the iPhone cameras like I can on the S25 Ultra. Even though third-party apps like Halide and ProCamera by Moment offer some control, it's still not the same. If I switch to manual mode in Halide, I'm stuck with 12MP photos only, not 48MP. Same with Moment: change the ISO and the resolution drops from 48MP to 12MP. Apple also limits third-party developers to a 1-second exposure time, which is pathetic. On the S25 Ultra I can expose for a true 30-second exposure and get awesome results; the third-party apps on iPhone have to stack 1-second exposures to get a fake long exposure. As a pro photographer, that's useless to me. Sure, I can use my Sony A7RV every day and carry medium-sized lenses around, but I don't want to do that unless I have a paid photo shoot. Maybe one day Apple will come up with an app like Samsung's Expert Raw; that would be great because it works so well, just like a DSLR, with full auto and manual control.
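For what it's worth, the stacking workaround those apps are forced into looks roughly like this in principle; this is only a sketch of the idea, not Halide's or Moment's actual pipeline:

```swift
// Sketch of exposure stacking: average N short frames to stand in for one long
// exposure. Frames are flat arrays of linear pixel values (0.0...1.0).
func stackExposures(_ frames: [[Double]]) -> [Double] {
    guard let first = frames.first else { return [] }
    var sum = [Double](repeating: 0.0, count: first.count)
    for frame in frames {
        for i in 0..<frame.count {
            sum[i] += frame[i]
        }
    }
    // Averaging cuts read noise roughly by sqrt(N), but unlike a true 30-second
    // exposure it never integrates more light per frame than the 1s cap allows.
    return sum.map { $0 / Double(frames.count) }
}

// Thirty 1-second frames standing in for a "30-second" exposure:
let stacked = stackExposures(Array(repeating: [0.10, 0.42, 0.87], count: 30))
```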
 
I’ve been a little disappointed so far with the 17 Pro Max camera. Like others have stated, the main camera is unchanged quality-wise. What’s been really perplexing me is the apparent neutering of the telephoto lately. I’ve been comparing with my 16 Pro Max the last few days, and ever since iOS 26 it’s been far more hesitant to actually select the physical telephoto lens in the Apple Camera app. I have to force it by blocking the other lenses briefly with my finger. Halide and Lightroom still default to it, however, so that’s a relief.

Another oddity is that 4x on the 17 Pro Max will produce 48MP images, but any zoom beyond that drops to 12MP, so there is apparently pixel binning happening once you zoom past 4x at all. Images beat out the 16 Pro Max most of the time, of course, but the metadata reveals a ton of AI image processing going on with the 17 Pro Max’s “optical quality zoom” cropping as well. I don’t hold many expectations for groundbreaking changes with Apple products anymore, but this phone truly has been a letdown so far.
 
What’s more concerning is that the claimed 8x optical zoom is essentially a lie.

The 17 Pro does not have 8x optical zoom.

It only offers 4x optical zoom, and the image is then digitally cropped to simulate 8x.

That’s actually less optical zoom than the iPhone 16 Pro (5x).
But a much better focal length for portraits.
 
Same size since the 14, but not the same sensor. Also, Apple isn't unique in using sensors across generations.

Google used the same sensor in the Pixel 3, 4, and 5.

Sammie used the same one for the S22, S23, and S24.
 
Nearly every camera — film or digital — does some cropping. It’s unavoidable, since a lens will produce a circular image and most sensors are rectangular.
Not quite. Yes, camera lenses make a circular image and sensors are rectangular, but that’s just how the sensor captures the image; it’s not the same as digital cropping. Normal cameras are designed so the sensor fits within the lens’s usable image circle.

The iPhone 17 Pro’s 8x zoom uses the 4x optical lens and 48MP sensor, then crops into the center to simulate 8x magnification. That means the final image isn’t a full 48MP and ends up around 12MP.
 
But a much better focal length for portraits.
Yes, but it doesn’t change the fact that beyond 4x it’s just digital cropping. In low light and at zoom levels above 4x, the 17 Pro images lose sharpness and show more noise compared to the 16 Pro’s true 5x telephoto.
 
Not quite. Yes, camera lenses make a circular image and sensors are rectangular, but that’s just how the sensor captures the image; it’s not the same as digital cropping.

I didn’t say it was “digital cropping.” I said it was cropping. Stop putting words into my mouth.


Normal cameras are designed so the sensor fits within the lens’s usable image circle.

Correct. *Within* the image circle. A subset of the circle. i.e., cropping. Thus, “digital zoom” by your own unique personal definition — “cropping is a form of digital zoom.”

You can’t have it both ways. If you want to apply that definition to the iPhone 17 Pro, you have to apply it to *all* cameras.

Or, you can stop being stubborn and accept the *standard* definition that is used by the rest of the world.


The iPhone 17 Pro’s 8x zoom uses the 4x optical lens and 48MP sensor, then crops into the center to simulate 8x magnification. That means the final image isn’t a full 48MP and ends up around 12MP.

It’s not “simulating” anything. You’re determined to twist words to fit your failed argument.
 
I didn’t say it was “digital cropping.” I said it was cropping. Stop putting words into my mouth.

Correct. *Within* the image circle. A subset of the circle. i.e., cropping. Thus, “digital zoom” by your own unique personal definition — “cropping is a form of digital zoom.”

You can’t have it both ways. If you want to apply that definition to the iPhone 17 Pro, you have to apply it to *all* cameras.

Or, you can stop being stubborn and accept the *standard* definition that is used by the rest of the world.

It’s not “simulating” anything. You’re determined to twist words to fit your failed argument.
Well, he's right. It is a crop from the 48MP sensor to give an equivalent 8x, just like small-sensor DSLR cameras that use an APS-C sensor, which gives a crop magnification to lenses attached to it. The 1x camera on the iPhone crops to 12MP to give us the 2x camera equivalent via pixel binning. The final output is 12MP. So why is that a failed argument?
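For anyone who wants the arithmetic spelled out, the crop-factor math is the same in both cases (numbers here are illustrative; the ~100mm-equivalent figure for the 4x tele is my assumption):

```swift
// Crop-factor math: an equivalent focal length is just actual * crop factor.
// Example values are illustrative.
func equivalentFocalLength(actual: Double, cropFactor: Double) -> Double {
    actual * cropFactor
}

print(equivalentFocalLength(actual: 50, cropFactor: 1.5))   // 75.0  -- a 50mm lens on an APS-C body
print(equivalentFocalLength(actual: 100, cropFactor: 2.0))  // 200.0 -- the "8x" look from a 4x (~100mm-equiv) tele via a 2x crop
```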
 
Well, he's right. It is a crop from the 48MP sensor to give an equivalent 8x, just like small-sensor DSLR cameras that use an APS-C sensor, which gives a crop magnification to lenses attached to it. The 1x camera on the iPhone crops to 12MP to give us the 2x camera equivalent via pixel binning. The final output is 12MP. So why is that a failed argument?
Because words have meanings. You can’t just make them mean whatever you want them to mean, no matter what Humpty Dumpty said.

Digital zoom is more than just cropping. Even Wikipedia will tell you that much.

“Digital zoom is a method of decreasing the precise angle of view of a digital photograph or video image. It is accomplished by cropping an image down to an area with the same aspect ratio as the original, and scaling the image up to the dimensions of the original.” [emphasis added]
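To make the two steps in that definition concrete, here is a sketch with Core Image; it is only an illustration of the definition, not a claim about what Apple's pipeline does:

```swift
import CoreImage

// "Digital zoom" per that definition: crop to the same aspect ratio, then scale
// back up to the original dimensions.
func digitalZoom(_ image: CIImage, factor: CGFloat) -> CIImage {
    let extent = image.extent
    let cropRect = CGRect(x: extent.midX - extent.width / (2 * factor),
                          y: extent.midY - extent.height / (2 * factor),
                          width: extent.width / factor,
                          height: extent.height / factor)
    // Step 1: the crop. On its own this only narrows the field of view.
    let cropped = image.cropped(to: cropRect)
    // Step 2: the upscale back toward the original size -- the resampling step
    // the Wikipedia definition calls out.
    return cropped.transformed(by: CGAffineTransform(scaleX: factor, y: factor))
}
```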
 
Because words have meanings. You can’t just make them mean whatever you want them to mean, no matter what Humpty Dumpty said.

Digital zoom is more than just cropping. Even Wikipedia will tell you that much.

“Digital zoom is a method of decreasing the precise angle of view of a digital photograph or video image. It is accomplished by cropping an image down to an area with the same aspect ratio as the original, and scaling the image up to the dimensions of the original.” [emphasis added]
Well, my comments had nothing to do with digital zoom, just crop factor. And nobody's putting words in your mouth unless they're feeding you alphabet soup.
 
It’s not though. It is the same physical size as the 14/15 Pro sensor. But last year the sensor was upgraded to a newer version that reads data out much faster. The newer sensor allows the phone to keep pumping out 24MP images even if you are rapidly firing the shutter. The 15 Pro would drop down to 12MP after 3-4 rapid shots because it couldn’t write data off the sensor fast enough to do the 24MP processing.

This is an error in their writeup.
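For context, a back-of-envelope on why sustained full-resolution readout is the bottleneck; the bit depth and burst rate below are assumptions, not published specs:

```swift
// Rough readout math. Bit depth and burst rate are assumed values.
let pixels = 48_000_000.0
let bitsPerPixel = 12.0
let bytesPerFrame = pixels * bitsPerPixel / 8.0          // ~72 MB per full-res readout
let burstShotsPerSecond = 3.0
let sustainedMBps = bytesPerFrame * burstShotsPerSecond / 1_000_000

print("≈\(Int(bytesPerFrame / 1_000_000)) MB/frame, ≈\(Int(sustainedMBps)) MB/s sustained off the sensor")
```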
 
It’s not though. It is the same physical size as the 14/15 Pro sensor. But last year the sensor was upgraded to a newer version that reads data out much faster. The newer sensor allows the phone to keep pumping out 24MP images even if you are rapidly firing the shutter. The 15 Pro would drop down to 12MP after 3-4 rapid shots because it couldn’t write data off the sensor fast enough to do the 24MP processing.

This is an error in their writeup.
I'm sure what they meant was that the optics for the cameras over the past few years haven't changed: f/1.7, 24mm. Apple needs to improve the optics so the light hitting the sensor is more even, instead of vignetting at the edges while brighter in the center. This is hugely noticeable when using night mode and photographing the night sky, and it's difficult to edit out. Currently the iPhone optics still show a lot of ghosting or lens flare with specular highlights in photos or videos. My Samsung S25 Ultra doesn't have this issue as badly; it's barely visible. Somehow Samsung has mastered the multi-coatings on the optics. I sure hope next year's iPhone is better and doesn't use the same cameras. It probably will, though, just in different colors.
 
It’s not though. It is the same physical size as the 14/15 Pro sensor. But last year the sensor was upgraded to a newer version that reads data out much faster. The newer sensor allows the phone to keep pumping out 24MP images even if you are rapidly firing the shutter. The 15 Pro would drop down to 12MP after 3-4 rapid shots because it couldn’t write data off the sensor fast enough to do the 24MP processing.

This is an error in their writeup.
So from what I can make out, the only change from the sensor in the 15 Pro to the 16 Pro is maybe the ISP/DSP, plus it's designed to work with the new SoC, which enables the faster readout. Which is awesome! I'm not downplaying the faster readout; people say it's helped a little bit. But it's the same, bro, people aren't lying about this: https://forums.macrumors.com/threads/sony-imx903-sensor-on-16-pro.2438445/
 