I switched from an iPhone 12 to an iPhone 14 Pro almost a month ago, and the 14 Pro's new camera system is definitely superior to the iPhone 12's in terms of image quality.
However, the camera system's algorithm is getting too "smart" and automatic. Apple doesn't even allow us to turn off Scene Detection and Smart HDR on the iPhone 14 Pro anymore, both of which I could turn off on the iPhone 12.
Yesterday I was trying to use the iPhone 14 Pro's 77mm telephoto camera to capture a butterfly, but while I was framing it, I noticed the camera kept switching between the 24mm and 77mm lenses. So I decided to capture the butterfly with both lenses to see the difference.
As you can see from the EXIF below (on the right), the iPhone 14 Pro used the centre 12MP (48mm) area of the 48MP sensor and digitally zoomed it to mimic a 78mm equivalent.
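For anyone who wants to check their own shots, here's a minimal Swift sketch that reads the same two EXIF fields using Apple's ImageIO framework. The file path is just a placeholder:

```swift
import Foundation
import ImageIO

// Read the actual and 35mm-equivalent focal lengths from a photo's EXIF.
func printFocalLengths(of url: URL) {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let exif = props[kCGImagePropertyExifDictionary] as? [CFString: Any]
    else { return }

    // Physical focal length of the lens that actually fired.
    let actual = exif[kCGImagePropertyExifFocalLength] as? Double ?? 0
    // 35mm-equivalent focal length, reflecting any digital crop/zoom
    // (78 for the cropped shot, 77 for the real telephoto).
    let equivalent = exif[kCGImagePropertyExifFocalLenIn35mmFilm] as? Double ?? 0
    print("actual: \(actual)mm, 35mm equivalent: \(equivalent)mm")
}

// Placeholder path - point this at one of the two butterfly shots.
printFocalLengths(of: URL(fileURLWithPath: "/path/to/IMG_0001.HEIC"))
```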
You can instantly see that the digitally cropped 78mm shot has a shallower depth of field than the 77mm one because of the main camera's larger aperture. But obviously the real telephoto lens resolves more detail than the cropped one.
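A quick back-of-the-envelope way to sanity-check the depth-of-field difference: for identically framed shots viewed at the same size, depth of field is roughly inversely proportional to the physical entrance pupil diameter (actual focal length divided by f-number). The actual focal lengths below are my assumptions from the spec sheets, so treat the numbers as approximate:

```swift
import Foundation

// Compare entrance pupil diameters: a bigger pupil means a shallower
// depth of field when both shots cover the same field of view.
struct Lens {
    let name: String
    let actualFocalMM: Double // physical focal length (assumed, not measured)
    let fNumber: Double
    var pupilMM: Double { actualFocalMM / fNumber }
}

let mainCropped = Lens(name: "Main, cropped to ~78mm equiv", actualFocalMM: 6.86, fNumber: 1.78)
let telephoto   = Lens(name: "Telephoto, 77mm equiv", actualFocalMM: 9.0, fNumber: 2.8)

for lens in [mainCropped, telephoto] {
    print("\(lens.name): entrance pupil ≈ \(String(format: "%.2f", lens.pupilMM))mm")
}
// ≈ 3.85mm vs ≈ 3.21mm: the cropped main-camera shot should indeed
// render a slightly shallower depth of field at the same framing.
```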
And one more weird thing: when I zoomed in on the butterfly's wings, the cropped version has some strange artifacts. Maybe it's Deep Fusion?
I know Apple automatically switches to the macro (ultra-wide) camera when you get close to a subject, but I don't understand why Apple decided to switch between cameras at telephoto focal lengths too.
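Presumably the stock Camera app sits on top of the virtual multi-camera device, which is what swaps lenses for you. If you drop down to AVFoundation in a third-party app, you can pin the physical telephoto module and the switching stops. A rough sketch (no permission handling or error reporting):

```swift
import AVFoundation

// Build a capture session pinned to the physical telephoto module,
// so the system never swaps in a crop from the main camera.
func makeTelephotoSession() -> AVCaptureSession? {
    // Ask for the physical telephoto camera rather than the virtual
    // multi-camera device that auto-switches between lenses.
    guard let tele = AVCaptureDevice.default(.builtInTelephotoCamera,
                                             for: .video,
                                             position: .back),
          let input = try? AVCaptureDeviceInput(device: tele)
    else { return nil } // no telephoto module, or camera access not granted

    let session = AVCaptureSession()
    session.sessionPreset = .photo
    guard session.canAddInput(input) else { return nil }
    session.addInput(input)

    let output = AVCapturePhotoOutput()
    guard session.canAddOutput(output) else { return nil }
    session.addOutput(output)
    return session
}
```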
Just my two cents: I think Apple is getting confident that the 48MP sensor and its computational photography algorithms are so good that a digitally cropped and zoomed photo can compete with a standalone telephoto sensor and lens. Most people who use an iPhone post photos on social media, where first impressions are king and fine detail doesn't matter. It makes me believe Apple will adopt a 100MP sensor, crop/bin the hell out of it to a 12MP telephoto output, and kill the actual telephoto lens to make space for the rumoured periscope lens on future iPhones. The algorithm that auto-switches between the main and telephoto cameras is paving the way for a more ambitious future of computational photography.