Personally, I'm less concerned about a lack of incremental features and more concerned about a lack of incremental
fixes, particularly with the cameras.
Tom's Hardware's review,
MKBHD's review, and
the Verge's review cover most of these:
On the iPhone 14 Pro:
- Darker skin tones are still significantly overexposed.
- Blue / white "orbs" of reflected light still show up in low-light video recordings.
- The "3x telephoto" still sometimes appears to be a crop of the 1x wide (and 2x is always a crop).
- All the lenses still incorrectly bias toward a warmer white balance than reality.
- Colors are still crushed in low-light photography.
- Lit areas are still sometimes overexposed in low-light photography.
- Strands of hair are still incorrectly clipped / removed in Portrait Mode.
I don't need any big change or big feature; just fix the multi-generational bugs and problems, Apple. There are so many basic, long-running problems on these $1000+ "Pro" phones that so heavily market their camera upgrades. Most of these should be fixable in software, but Apple loves hardware-locking basic software features (like animations in the Weather app).
I'd consider upgrading if this Photonic Engine had actually delivered significantly more across-the-board polish. Better to wait for Photonic Engine V2, I guess, if these issues concern you.
You've hit the nail on the head. All of the problems you've listed have been around for a long time. Every year Apple updates the cameras and they get slightly better in some scenarios, but the changes between generations are never enough to really say "wow, shooting under condition X got way better" or "wow, the detail is a lot better this year." Even compared to iPhones from 4 years ago, some of the telltale iPhone weak points are still here.

When Apple were babbling on about the Photonic Engine's "new rendering pipeline" during the keynote with ZERO side-by-side comparisons, my gut instinct was "uh oh, this is pure marketing speech and they haven't improved on the long-standing iPhone camera issues." Based on the review samples we've got so far, it looks like I might be right. I'll have to test it out myself, but the regular binned images look identical to the 13 Pro's, if not a bit more washed out. When Deep Fusion came out, the improvements were noticeable enough to warrant a side-by-side comparison during the keynote.
The argument from many is always something along the lines of "well, expecting major differences year over year is ridiculous, tech has plateaued and people don't upgrade every year anyway." Ok, I'll grant that people don't upgrade every year and the rate of new features has slowed down. But why does that excuse zero year-over-year improvement on long-standing issues like the stuff you mentioned, when the other two big players, Samsung and Google's Pixel, do manage significant year-over-year improvements in their respective departments? Some off-the-dome examples:
- The Pixel 6 upgraded to a 50MP sensor. Detail in regular binned JPEGs became significantly better than the previous year's Pixel 5, and even this year's iPhone 14 Pro. Apple have upgraded their sensor's pixel count too, and based on the ProRAW samples it can clearly resolve more detail, so why is the oil-painting effect still so noticeable? I thought the oil-painting sharpening was there to make up for a lack of sensor resolution?
- The Pixel 6 introduced some of the best color science in the smartphone world for rendering brown and black skin, and all manner of color in general; the iPhone is seriously behind on this. Apple have had years to fix it but haven't. If they fix it next year, it will only be because they're catching up with the Pixel. Why wait for the Pixel to surface the issue in the first place? Why not proactively push your color science further?
- Samsung work on their portrait mode edge detection every year and now their hair masking is stupidly good. Meanwhile almost all the iPhone portraits I take have the telltale giveaway of blurry hair.
- Samsung's colors look better year over year; they've reduced the oversaturation whilst retaining really pleasant tones (especially in yellows and greens). Meanwhile the colors on my 13 Pro look mostly the same as on my 11 Pro, except there's a lot more oil-painting effect on the 13.
- Night photos on the iPhone still destroy the color palette by making everything a sickly yellow; the Pixel, on the other hand, looks brilliant (in terms of color, at least; their approach to image brightness is debatable).
The bottom line is that Pixel's and Samsung's year-over-year improvements are significantly more impactful than the iPhone's. Perhaps in Apple's quest to retain "the iPhone look" (which to me is starting to mean an over-processed oil-painting look), they are less willing to take risks adjusting colors and rendering algorithms; who really knows. There's no excuse, especially when you consider Apple only make two phones a year (actually, make that one phone a year, given their new trend of using last year's Pro components in this year's regular series).
I think the sad reality is that Apple are primarily competing with themselves now, AKA not competing at all, especially with their Gen Z customers who are hooked into the iPhone ecosystem and will basically never leave. Apple don't have to push themselves to beat the Pixel and Samsung; all they have to do is be better than what they released 2-3 years ago to justify someone upgrading from one iPhone to another.
I think this night shot comparison from The Verge really sums everything up:
Pixel 6 Pro (below): The sky is a pleasant blue-grey. The buildings appear lifelike in color, contrast, and overall detail. Look at the lights within those buildings; they look realistic because they are not blown out or glowing. There is a perceivable element of depth to everything in the shot. The blue of the Ferris wheel is present without appearing overly saturated and bright. The domed roof on the building toward the right of the picture retains great color and contrast; in particular, look at how nicely the red and pink hues are rendered. The water is detailed but not screaming for attention; the reflections look great, and the color and texture of the ripples appear lifelike too.
Overall everything is BALANCED; this image looks like a real city taken with a real camera. If you told me this was taken on an entry-level compact mirrorless, I would honestly believe you (unless I zoomed in).
^ Pixel 6 Pro
iPhone 14 Pro (below): WELCOME TO GOTHAM. Everything has a sickly yellow glow to it. The sky looks like ash instead of evening clouds. The subtlety of the building colors is gone. The building windows look more like glowing yellow globs of light than actual windows. The Ferris wheel's blue has been completely nuked, much like the other colors in this image. Look at the domed roof on the right: overexposed, with the red completely washed out vs. the Pixel. Lest I forget: is that water, or crude oil I see? The color of the water is gone, and the oversharpening is unflattering.
Overall this image is disgusting to look at, and I mean that with no exaggeration. If I took this image I would never want to look at it again, let alone share it with anyone. This looks like an image taken by an incapable camera overcompensating with over-processing; it can't even get basic white balance right, for heaven's sake. If you told me this was taken on a budget Poco phone, I would believe you. iPhones have been pulling this yellow stunt for YEARS. What does the upgrade cycle have to do with an image this poor when the Pixel can clearly handle it!?
^ 14 Pro