
TonyC28 (original poster):
I haven’t been all that impressed with my iPhone 16 Pro camera since getting it. I’m hoping I just have some settings wrong. Skin always looks blurry, like clay if that makes sense. And the pics just don’t look as sharp as with my iPhone 13 Pro. So I did this test. Pictures taken of the same object, in the same room, with the same lighting. Phone in a holder and used the timer to eliminate any shake. Taken from the same distance.

iPhone 16 Pro

IMG_7819.jpeg


iPhone 12

IMG_0170.jpeg


At a glance, you can tell the iPhone 12 pic is sharper.

Cropping to a much closer view, you can really see the difference.

iPhone 16 Pro

IMG_7819.jpeg


iPhone 12

IMG_0170.jpeg


Stats after cropping:

IMG_7827.jpeg
IMG_7826.jpeg


The ISO is way higher with the 16. I don’t know why.

So is there a setting I’m missing somewhere? Do I maybe have a problem with my phone?
 
You are using the Ultra Wide camera (which has a lower-quality lens and sensor) on the 16 Pro and the main Wide camera on the 12. The main lens will always be better than either the Ultra Wide or the Telephoto.
 
The ISO is way higher with the 16. I don’t know why.

Because the aperture is significantly smaller on the ultra wide (f/2.2) than on the 12's wide (f/1.6). Less light reaching the sensor means the camera has to raise the ISO significantly to keep the photo from being underexposed. Higher ISO means more noise, which means more of the usual heavy-handed iPhone post-processing to remove that noise, which also kills detail. The difference in your two photos is perfectly understandable given this. Add the fact that the 16 Pro's ultra wide crams 48MP into a tiny sensor, and it's going to really struggle without enough light.
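If you want numbers: the f-number difference alone works out to about a stop. A quick back-of-the-envelope sketch (Swift just for concreteness; it ignores the sensor-size difference, which only makes the gap bigger):

```swift
import Foundation

// Light reaching the sensor scales with the inverse square of the f-number.
let f12Wide = 1.6        // iPhone 12 main Wide: f/1.6
let f16UltraWide = 2.2   // iPhone 16 Pro Ultra Wide: f/2.2

let lightRatio = pow(f12Wide / f16UltraWide, 2)   // ≈ 0.53
let stops = 2 * log2(f16UltraWide / f12Wide)      // ≈ 0.9 stops

print(String(format: "Ultra Wide gets ~%.0f%% of the light (%.1f stops less),",
             lightRatio * 100, stops))
print("so at the same shutter speed the ISO roughly doubles.")
```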

So why is one on the wide lens and one on the ultra wide? You're getting too close to your subject, so the 16 Pro automatically jumps to macro mode, which uses the ultra wide camera. Your iPhone 12 had no macro mode: as you got closer to your subject, it would eventually be unable to focus at all and the photo would go fully blurry, while the 16 Pro can keep focusing right up against your object. This is both a benefit and something to be aware of: the minimum focusing distance of each physical camera (ultra wide, wide, and telephoto) is different, and differs between iPhone models. Subject distance is one of the inputs the camera app uses to decide which physical camera to use, a process that is (unfortunately) completely opaque. It's something you need to be aware of when framing a photo.

When you hit one of the magnifiers in the camera app (0.5x, 1x, 2x, and 5x), you're choosing a magnification relative to the main camera's 1x view, not necessarily the physical camera the app will use to take the picture. The app chooses which camera to use based on the requested focal length, subject distance, and available light.
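To make the idea concrete, here's a purely hypothetical sketch of that kind of decision. To be clear, Apple's actual selection logic isn't public, and the thresholds below are invented for illustration:

```swift
// Hypothetical illustration only: Apple's real logic is not public,
// and every threshold here is made up for the example.
enum PhysicalCamera { case ultraWide, wide, telephoto }

func chooseCamera(requestedZoom: Double,
                  subjectDistanceMeters: Double,
                  sceneIsBright: Bool) -> PhysicalCamera {
    if requestedZoom < 1.0 { return .ultraWide }
    if requestedZoom >= 5.0 {
        // The tele can't focus close and struggles in low light, so the
        // app may fall back to a digital crop of the Wide camera instead.
        return (subjectDistanceMeters >= 1.4 && sceneIsBright) ? .telephoto : .wide
    }
    // Inside the Wide camera's minimum focus distance, the app
    // switches to the Ultra Wide to engage macro mode.
    return subjectDistanceMeters < 0.10 ? .ultraWide : .wide
}
```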

Computational photography is magic, but physics is physics, and photography is about light.
 
Thanks guys. Not sure how I missed the lens difference.
The further minimum focusing distance on the newer iPhones is quite annoying, but the trade-offs can be worth it. When I’m taking a picture of something and I want to make sure I’m using the main lens, I usually switch it to 2x before taking the picture. It makes sure I’m back far enough. This is mainly when I’m taking pictures of small objects or receipts when I don’t want to use macro.
 
Apple:
  • Pro camera system
  • 48MP Fusion: 24 mm, ƒ/1.78 aperture, second‑generation sensor‑shift optical image stabilization, 100% Focus Pixels, support for super‑high‑resolution photos (24MP and 48MP)
  • Also enables 12MP 2x Telephoto: 48 mm, ƒ/1.78 aperture, second generation sensor‑shift optical image stabilization, 100% Focus Pixels
  • 48MP Ultra Wide: 13 mm, ƒ/2.2 aperture and 120° field of view, Hybrid Focus Pixels, super-high-resolution photos (48MP)
  • 12MP 5x Telephoto: 120 mm, ƒ/2.8 aperture and 20° field of view, 100% Focus Pixels, seven-element lens, 3D sensor-shift optical image stabilization and autofocus, tetraprism design
  • 5x optical zoom in, 2x optical zoom out; 10x optical zoom range
Then when I tap on the 1X I see 24mm, 28mm, and 35mm. So what are the actual physical focal lengths of the three lenses?
 
13mm, 24mm, and 120mm. Everything else is a digital crop.
 
I think I’m getting it now. So the 2x telephoto at 48mm is not really telephoto, it’s a crop zoom?
Yes, it's just cropping in on the sensor. The iPhone has no variable zoom lenses, just fixed primes and the ability to crop to various simulated focal lengths.
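If you want to see the arithmetic: the crop factor is just the ratio of simulated to native focal length, and the pixels kept fall off with its square (pixel counts here are my own estimates, not Apple's published output sizes):

```swift
import Foundation

// Simulated focal lengths are center crops of the 24mm camera's 48MP sensor.
let nativeFocal = 24.0   // mm
let nativeMP = 48.0

for simulated in [28.0, 35.0, 48.0] {
    let cropFactor = simulated / nativeFocal               // linear crop
    let megapixelsKept = nativeMP / (cropFactor * cropFactor)
    print(String(format: "%.0fmm = %.2fx crop, ~%.0fMP of sensor used",
                 simulated, cropFactor, megapixelsKept))
}
// 28mm ≈ 1.17x (~35MP), 35mm ≈ 1.46x (~23MP), 48mm "2x" = exactly 12MP
```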
 
I would like to understand this better before I buy an iPhone 16. I understand that two different cameras were used in the original comparison, which explains the differing results. What concerns me is that, if I'm understanding correctly, the iPhone 16 chose to produce a worse photo than it could have. It sounds like a better result could have been obtained if the iPhone 16 had done what the iPhone 12 did. So even if the 16 is technically more capable than the 12, it might often produce worse results in macro mode because it's using its hardware poorly. Is that correct?
 
You're correct, but only technically. The iPhone 16 has more camera features than the iPhone 12 and as a result interacts with the subject you're trying to photograph in different ways than the iPhone 12 would. Yes, there's a point where the iPhone 16 would have worse performance than the iPhone 12 because it's switching to a less performant camera (the ultra wide) to engage macro mode. If the OP had continued to get closer to the phone keypad, eventually the iPhone 12 would've taken the worse photo because it would no longer be able to focus on an object that close, while the iPhone 16 would still be able to produce a usable photo.

Light, focus, focal length, focus distance, aperture, digital zoom, photographic styles... It's up to the person taking the photograph to understand how camera features affect end results. This is true of any camera, not just an iPhone. If I use my DSLR with my high-end 50mm prime lens to take a picture of a flower, and take the same picture with my somewhat junky 60mm macro lens, the photo taken with the 50mm, at certain focal distances, is going to look far better. But as I get closer to the flower, that 60mm macro lens will shine, and the 50mm prime, despite being an expensive, high-quality lens, will fail me. The same dynamic is at work for the OP.
 
You're welcome! I will note that while I'm laying a lot of this at the feet of the person taking the photo, the iPhone's built-in camera app does not make this easy since it does so much opaquely, in the background. For example, there's no way to know exactly which of the three cameras you're using (regardless of the zoom setting) without checking the photo's metadata after you've taken it. Is it a telephoto shot taken at 5x with the 5x camera? Or is it a 5x digital zoom taken with the main wide camera? No idea! Both look the same as you frame the scene on the screen, but the quality of the resulting photos will be very different. Maddening.
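If you have the files on a Mac, a few lines of Swift against ImageIO will dump the relevant tags (the file name is just the one from this thread; the focal-length values in the comment are rough and vary by model):

```swift
import Foundation
import ImageIO

// Inspect a photo's EXIF to see which physical camera actually shot it.
// The true optical focal length gives it away: very roughly ~2mm for the
// ultra wide, ~7mm for the wide, ~15mm for the 5x tele.
let url = URL(fileURLWithPath: "IMG_7819.jpeg")
if let source = CGImageSourceCreateWithURL(url as CFURL, nil),
   let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
   let exif = props[kCGImagePropertyExifDictionary] as? [CFString: Any] {
    print("Lens:", exif[kCGImagePropertyExifLensModel] ?? "n/a")
    print("Focal length (mm):", exif[kCGImagePropertyExifFocalLength] ?? "n/a")
    print("35mm equivalent:", exif[kCGImagePropertyExifFocalLenIn35mmFilm] ?? "n/a")
    print("ISO:", exif[kCGImagePropertyExifISOSpeedRatings] ?? "n/a")
}
```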

Complaints aside, for the most part the iPhone guesses your intent correctly and delivers a great photo. Sometimes it does not. The more specific your intent, the more likely the Camera app's automatic behavior is to feel frustrating and limiting, and that's the time to pursue something like Halide that gives you far more direct control.
 
I thought that one could choose which camera was used with the Camera Control.
The heading "Settings available with the Camera Control" has this item:
Cameras: Adjust the field of view by changing the camera.
I interpret that as allowing one to switch cameras. Maybe it's opaque because one actually adjusts the field of view, and the phone then switches between cameras.

I don't have an iPhone 16 to test this on, so I can't confirm what it does.

I also vaguely recall reading something about needing to use Settings > Camera Control first, in order to configure it so Camera Control controlled the camera. If I recall correctly, the choice was between "Zoom" and "Cameras", but I wouldn't swear to it.

EDIT
Here's the thread I recalled:
 
I interpret that as allowing one to switch cameras. Maybe it's opaque because one actually adjusts the field of view, and the phone then switches between cameras.
This.

Here's a controlled experiment to try that forces the same issue the OP experienced: Stand no more than 4 feet away from an object. Select the 5x lens/magnification through whatever method you prefer, and focus on that object. Take a picture. Now move at least 5 feet away, and do the same thing.

Look at the metadata of both photos. The photo taken at 4 feet or less will most likely be a 5x digital zoom taken with the main (wide) camera. The photo taken at greater than 5 feet will use the 5x telephoto camera, with no digital zoom, and may produce a sharper photo as a result.

The minimum distance that the 5x telephoto camera can focus is about 4.5 feet (for the 3x camera in older pros, it's about 3 feet). For the closer photo, the camera app has two choices: Stick to only using the 5x camera, resulting in a blurry photo since it won't be able to focus on an object that close. Or, do a digital zoom-and-crop with the camera that can focus that close (the primary Wide camera), resulting in a photo that is usable, but may be quality-compromised due to the digital zoom.

In most cases, the camera is choosing correctly: usable over optimal. I would rather get a photo I can use than miss the shot entirely. The problem is that the only way to know this decision has been made on my behalf is to check the metadata afterward. Ideally, the camera app would communicate exactly which camera is being used so you, the photographer, could adjust your framing or position to get the result you intend rather than an enforced compromise. Like many things in the iPhone Pro's camera setup that are arbitrary and enforced purely through software (like ProRAW support), it strikes me as odd that Apple doesn't let the camera app on the Pros "lock" which camera is used if the user so desires.
 
I just tried your experiment and you are absolutely correct. Basically by tapping .5, 1, 2, or 5 I am choosing the desired "zoom" amount, the phone then decides which lens to use to achieve it. You are quite the expert. Thank you!
 
Oh gosh, you're very welcome. I'm glad my over-wordy posts were useful. I have a background in traditional photography, so in many ways the iPhone approach to taking a photo runs counter to how I think about the relationship between me, the camera, and the subject. But I recognize the opportunity smartphones offer (as they say, the best camera is the one you have with you), so I've made it a point to try to understand what's going on. I get the priorities Apple has set, and there is logic to them for most users, but the moment you want to wrest control away from the genie in the box, it gets pretty messy.
 
I believe there are other apps, like ProCam, that will actually let you choose which camera to use.
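Under the hood that's straightforward: AVFoundation exposes each physical camera as a discrete device, so a third-party app can pin one and opt out of the automatic switching. A minimal sketch (not necessarily how ProCam itself does it):

```swift
import AVFoundation

// Requesting a discrete device type pins one physical camera;
// iOS won't silently switch away from it. Using .builtInTripleCamera
// instead would restore stock-app-style automatic switching.
let telephoto = AVCaptureDevice.default(.builtInTelephotoCamera,
                                        for: .video, position: .back)
let ultraWide = AVCaptureDevice.default(.builtInUltraWideCamera,
                                        for: .video, position: .back)
// Feed the chosen device into an AVCaptureSession as usual; a nil
// result means that camera doesn't exist on this model.
```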
 
They are both JPEG? Straight out of the camera? They would usually be HEIF by default.
Why is one 3MB and the other 1.6MB when the dimensions are almost identical?
That's a big enough difference in file size to suggest the 16 is applying a lot more compression, and that's enough to remove some of the detail. Not enough to make it jaggy, but enough to make the difference you see.
What can you do about it? Try shooting both in RAW, then process and compress them identically and see if they still look the same. If not, then it's something else.
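If you go that route, something like this (a rough ImageIO sketch; the output names and quality value are arbitrary) would re-encode both processed files with identical JPEG compression for a fair comparison:

```swift
import Foundation
import ImageIO
import UniformTypeIdentifiers

// Re-encode a photo at a fixed JPEG quality so both files get
// identical compression. File names follow this thread's examples.
func reencode(_ inPath: String, to outPath: String, quality: Double) {
    guard let src = CGImageSourceCreateWithURL(
              URL(fileURLWithPath: inPath) as CFURL, nil),
          let image = CGImageSourceCreateImageAtIndex(src, 0, nil),
          let dest = CGImageDestinationCreateWithURL(
              URL(fileURLWithPath: outPath) as CFURL,
              UTType.jpeg.identifier as CFString, 1, nil) else { return }
    let options = [kCGImageDestinationLossyCompressionQuality: quality] as CFDictionary
    CGImageDestinationAddImage(dest, image, options)
    CGImageDestinationFinalize(dest)
}

reencode("IMG_7819.jpeg", to: "16pro_same_q.jpg", quality: 0.9)
reencode("IMG_0170.jpeg", to: "12_same_q.jpg", quality: 0.9)
```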
 