
climber2020 · macrumors newbie · Original poster
Update on 12-15-22: See post #30 on page 2. A comparison between my phone and another 14 Pro definitely shows that my camera is defective.

I've noticed that the corners, especially the lower right corner (when the phone is held in landscape orientation with a right-hand shutter), are consistently soft & blurry in photos taken with my 14 Pro. It doesn't matter if it's HEIC, ProRAW, regular RAW, 12 MP, or 48 MP.

I haven't seen any other complaints about this but have noticed a similar issue looking at some 14 Pro pictures on Flickr.

Anyone else notice this problem? My guess is that it's either 1) a hardware defect where the lens isn't properly centered, or 2) just an inherent property of the camera's wide-angle/larger-sensor combo. I had the exact same issue with my 1st generation Sony RX100 years ago, where it was simply a property of how the camera was designed.

I'm trying to decide if it's worth it or not to exchange the phone for a new one, but there's no point if this is just how the camera is.

Attached are a few samples. In the vertical photo, the corresponding blurry corner is on the bottom left of the frame.


[Attached: IMG_2838.jpeg, IMG_2906.jpeg]
 
I noticed the same thing, but in my case it's the top left corner that is blurry. It's especially noticeable when scanning documents with the iPhone or taking photos of pages of text. I was wondering if it's a hardware issue (faulty iPhone) or a software bug. Did you replace your iPhone?
 
Is this on all lenses or just the 1x main camera?
If it's just the 1x, it could be because the image is being stitched together due to the new sensor.
If not, then it's a lens issue; some lenses are soft at the edges, but they shouldn't be outright blurry, so to speak…

If in doubt, test it in store against a demo model and see if it's the same.
 
It's how all lenses/sensors are. Every sensor displays this way, unless it's a curved sensor like the one Sony (I believe?) is developing.

TL;DR: it's physics, and inherent to the lens/sensor itself. I shoot in 16:9 almost exclusively to avoid this, and also to match how my eyes see the world, not to mention the screen shape on an iPhone.
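To put a rough number on the 16:9 point: shooting 16:9 keeps the full sensor width but crops the height, so the extreme corners of the native 4:3 frame never make it into the shot. A minimal sketch, assuming a 4032×3024 (12 MP) output; the exact pixel counts don't matter, only the ratios:

```python
# Rough sketch: how much of the native 4:3 frame a 16:9 crop throws away,
# and how much closer the cropped corners sit to the lens axis compared to
# the full-frame corners. The 4032x3024 dimensions are an assumption for
# illustration.
import math

w, h = 4032, 3024                 # assumed native 4:3 output in pixels
h_169 = w * 9 / 16                # height kept by a 16:9 crop (full width)

kept_area = (w * h_169) / (w * h)
full_corner = math.hypot(w / 2, h / 2)        # corner distance from center
crop_corner = math.hypot(w / 2, h_169 / 2)

print(f"16:9 crop keeps {kept_area:.0%} of the 4:3 frame area")
print(f"corner distance from center drops by {1 - crop_corner / full_corner:.1%}")
```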
 
Sorry for the delayed reply; it was all crickets for a month and a half so I thought the thread had died.

It's just the 1x camera. When I flip the phone over and take the picture "upside down" so the shutter is with my left hand, the bottom right corner looks better.

The 3x telephoto lens is sharp as a tack from corner to corner. The ultrawide has blurry corners, but that's to be expected being an ultrawide.

My last phone was a 12 Mini, and the corners were pretty sharp.

My wife is having her 14 Pro delivered on Monday, so I'm gonna compare the two. If mine is indeed faulty, then I plan on sending it back to Apple to replace the camera module.
 
It's actually lens coma. Take a photo of a brick wall and view the corners; all four corners could show it. DSLR lenses, especially the fast ones (f/1.4, for example), will show this effect when the lens is wide open. It's common and a normal characteristic of fast lenses. However, if just one corner shows it all the time (your sample, from what I can tell, is a little more distorted than normal), then one or more optical elements in the camera could be slightly flawed. It's also possible that the sensor is not directly centered in the module below the lens elements.

See if it goes away or decreases when the iPhone is on a tripod rather than handheld, so the sensor-shift stabilization is not active. A star test would also be the best way to see the coma: shoot a 10-second shot of the night sky and look at the stars in the corners. Present that to Apple and they will probably see it better. My two 14 Pro Max cameras have some lens coma, though one iPhone is a little sharper than the other when shooting the stars. But they are both very similar.
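If you want to put numbers on the brick-wall test instead of eyeballing it, here's a minimal sketch that crops the four corners and compares a simple sharpness score. It assumes Pillow and NumPy are installed, and brick_wall.jpg is just a placeholder filename:

```python
# Minimal sketch: crop the four corners of a test shot and compare a simple
# sharpness score, so "one corner is softer" becomes a number instead of an
# eyeball call. Assumes Pillow + NumPy; "brick_wall.jpg" is a placeholder.
import numpy as np
from PIL import Image

def sharpness(gray):
    """Variance of a 3x3 Laplacian -- higher means more fine detail."""
    g = gray.astype(np.float64)
    lap = (-4 * g[1:-1, 1:-1]
           + g[:-2, 1:-1] + g[2:, 1:-1]
           + g[1:-1, :-2] + g[1:-1, 2:])
    return lap.var()

img = np.array(Image.open("brick_wall.jpg").convert("L"))
h, w = img.shape
c = 600  # corner crop size in pixels

corners = {
    "top-left":     img[:c, :c],
    "top-right":    img[:c, -c:],
    "bottom-left":  img[-c:, :c],
    "bottom-right": img[-c:, -c:],
}
for name, crop in corners.items():
    print(f"{name:13s} sharpness: {sharpness(crop):.1f}")
```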
 
I figured it was either some optical aberration or a decentered lens.

I've looked at quite a few 14 Pro pictures on Flickr using the camera finder, and it seems like there's a lot of sample variation. Some photos look just like mine, others are worse where all the corners and even the edges are blurry, and some are sharp across the entire frame with mild blurring at the extreme corners only visible under high magnification.

My plan is to wait a few months before I consider sending it in for repair. With the current supply chain/production issues, I wouldn't be surprised if quality control isn't up to usual standards at the moment.
 
Here is an example of my iPhone 14 Pro Max at 48 MP.
I am about 8 feet from the wall. I let the camera autofocus on its own to achieve this photo, so it's even. Normally I'll touch the screen where I want the camera to focus and to meter the exposure of that area as well. If you always let the iPhone autofocus on its own, it's always going to focus on the closest thing to the camera first, so make sure you focus on the object that you intend to have in focus. That might help with the corner clarity.


[Attached: CBB504AA-17F7-40B2-812F-44962321AFF4.jpeg]
 
I wonder whether it makes a difference, but where was your phone assembled?
My regular 14 with this issue was assembled in India.
[Attached: 1670821920684.png]
 
I've noticed that the corners, especially the lower right corner (when the phone is held in landscape orientation with a right-hand shutter), are consistently soft & blurry in photos taken with my 14 Pro. […]
Can you please mark these areas? I can't see it.
 
I think MacRumors adjusts the resolution/quality of the uploaded photos. Here is another example. The first one is the full photo; the second is cropped to show the blurry corner.
[Attached: IMG_3129.jpeg (full photo), tempImageOEMBva.png (corner crop)]
 
I think MacRumors adjusts the resolution/quality of the uploaded photos. Here is another example. […]
Yeah, that's perfectly normal. It's called lens coma. Only your most expensive and finest optics will be extremely sharp all the way to the edge, like Carl Zeiss, for example. My Carl Zeiss Milvus 50mm & 85mm lenses for my Sony and Nikon cameras were soft in the corners just like this, and that was an $1800 85mm lens. The only one that didn't have coma distortion in the corners was the Zeiss Otus, which was a $5000 lens. So don't worry about it; it's not an optical flaw of any kind, it's the nature of the optics.

It could also be that the imaging sensor was shifted a little to the right by its image stabilization (sensor-shift IBIS), as the sensor moves up and down and left and right within the camera to reduce shake. If it got too far to the right, it would pick up some of that lens distortion. The new iPhone 14 Pro series has a much larger sensor, it's pretty big. Also, with it being 48 megapixels, it's going to pick up more optical flaws if any exist.
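To make the sensor-shift point a bit more concrete, here's a rough sketch of how a sideways shift pushes the corners further out toward the edge of the image circle, where aberrations are worst. The sensor dimensions and the shift amount below are assumptions for illustration, not published figures:

```python
# Rough illustration: if the sensor slides sideways during stabilization,
# its corners reach further from the lens axis and closer to the edge of
# the image circle. Sensor size (~1/1.28-inch type) and shift amount are
# assumptions, not published figures.
import math

sensor_w, sensor_h = 9.8, 7.3    # assumed active area in mm
shift = 0.2                      # hypothetical sideways IBIS shift in mm

centered = math.hypot(sensor_w / 2, sensor_h / 2)
shifted  = math.hypot(sensor_w / 2 + shift, sensor_h / 2)

print(f"corner radius centered: {centered:.2f} mm")
print(f"corner radius shifted:  {shifted:.2f} mm "
      f"(+{(shifted / centered - 1) * 100:.1f}%)")
```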
 
Yeah, that's perfectly normal. It's called lens coma. […] So don't worry about it; it's not an optical flaw of any kind, it's the nature of the optics.

What's disappointing is that my last phone, the 12 Mini, was sharp across the entire frame (example below). Same with my XS before that. I'm fine with some softness in the extreme corners, but the blurring with the 14 Pro is noticeable in an 11x14 print, probably even an 8x10.


12 Mini sample:

[Attached: 12 Mini.jpeg]



12 Mini crop of right lower corner:

[Attached: 12 Mini crop.jpg]
 
What's disappointing is that my last phone, the 12 Mini, was sharp across the entire frame (example below). Same with my XS before that. […]
The sensors in the Mini and the XS are much smaller than the one in the 14 Pro, so they were catching the image from the center of the lens instead of at the edges, much like a crop-sensor DSLR using full-frame glass. The 11 Pro Max has a sensor 42% larger than the XS and 12 Pro, the 12 Pro Max was just a little larger than the 11 Pro Max, the 13 Pro Max has a sensor 52% larger than the 12 Pro Max, and the 14 Pro/Max has a sensor 62% larger than the 13 Pro series. So you can see why the camera bump is bigger, and the optics in the 14 Pro probably barely cover the sensor, especially when the sensor is shifting as it adjusts for camera shake. Maybe mount the iPhone on a tripod so the sensor will not move to adjust for camera shake, and check the corners again. If there is blur in every corner and it looks the same, then most likely the sensor is centered below the optics. I think that's the best way to test. If the blur remains in the right corner, maybe contact Apple to see what they suggest.
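Taking the percentages above at face value, chaining them together gives a rough feel for how much the main sensor has grown since the XS generation:

```python
# Back-of-the-envelope check using the percentages quoted above, taken at
# face value. The "little larger" 12 Pro Max step is ignored, so the real
# multiple would be slightly higher.
factor = 1.0
for step in (1.42,   # 11 Pro Max vs XS / 12 Pro
             1.52,   # 13 Pro Max vs 12 Pro Max
             1.62):  # 14 Pro vs 13 Pro series
    factor *= step

print(f"14 Pro main sensor is roughly {factor:.1f}x the size of the XS sensor")  # ~3.5x
```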

So are you allowing the camera to focus automatically on its own, or are you touching the display to select where you want the camera to focus? Choosing where you want the camera to focus will probably make a big difference in how the camera behaves as far as image quality. Without touching the display to select a focus point, the camera will constantly refocus, and it will also focus on the closest object to the camera unless you select your preferred focus area.

I'll have to test both of my 14 Pro Max iPhones to see the corners. I expect to see some coma but hopefully it's not bad. I'll test with autofocus only, and test with selective focus by touching the screen in the center to set focus at infinity.
 
Let's try some kind of blind test. Which picture is better and which one is the worst?
X: https://postimg.cc/bdpJb9r5
Y: https://postimg.cc/DJg75R0M
Z: https://postimg.cc/4KGCMwDk
I don't know if this would be considered a good test or not. The question I have is: was the camera focused on the same spot in each photo from each device by touching the same spot on the display to select the focus, or was the camera just aimed at the scene and the shutter button pressed, allowing only autofocus? If a specific area of the scene was not selected with touch focus, then the end result is probably going to vary quite a bit.
 
And here's one with the files named according to the phone models.
The contrast of the laptop screen exaggerates and amplifies all of the camera's distortions, while the keyboard sits along the edge of the frame, where aberrations are also very visible. Plus, these are low-light conditions.
The iPhone 11 looks good overall; the iPhone 14 looks worse, especially bad along the edges; the iPhone SE2 looks humble, but at least there are no outstanding distortions along the edges.

In a neighbouring topic I posted a less extreme example:
 
So are you allowing the camera to focus automatically on its own, or are you touching the display to select where you want the camera to focus?

For the pictures posted, it's all autofocus. At these distances, everything should be hyperfocal anyway. I have done tap to focus with the same results; the right lower corner is always more blurry than the rest.

Even shooting a subject where the entire frame is at infinity (like a mountain range with no foreground subject), that side is still consistently blurry.
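For what it's worth, a quick back-of-the-envelope check suggests the hyperfocal distance on the main camera is only a few meters, so a distant scene like that should indeed be entirely in focus. The focal length, aperture, and circle-of-confusion values below are approximations pulled from typical 14 Pro main-camera EXIF data, not official specs:

```python
# Quick sanity check on the "everything is hyperfocal" point, using the
# standard H = f^2 / (N * c) + f formula. The focal length, aperture and
# circle-of-confusion values are approximations, not official specs.
f = 6.86      # focal length in mm (24 mm equivalent)
N = 1.78      # maximum aperture
c = 0.0086    # circle of confusion in mm (~0.030 mm full-frame / ~3.5 crop)

H = f**2 / (N * c) + f          # hyperfocal distance in mm
print(f"hyperfocal distance ≈ {H / 1000:.1f} m")
print(f"everything from ≈ {H / 2000:.1f} m to infinity is acceptably sharp")
```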

I'll post an update once I do some testing with my wife's new 14 Pro.
 
The sensors in the Mini and the XS are much smaller than the one in the 14 Pro, so they were catching the image from the center of the lens instead of at the edges, much like a crop-sensor DSLR using full-frame glass.

But aren't the lenses in the 12 Mini and XS also proportionally smaller to match the sensor?

The lens itself (not the entire camera bump) on the 14 Pro is huge compared to the lens on the 12 Mini.
 
But aren't the lenses in the 12 Mini and XS also proportionally smaller to match the sensor? […]
Yes, and I think the bump on the 14 sticks out so much further because they have to get that lens far enough away from the sensor so the image will cover the entire plane of the larger sensor… at least that's my guess anyway.
 