
HEEELP

macrumors newbie
Original poster
Dec 5, 2024
Hello and please help. I traded in my 13 Pro Max and assumed the 16 would be much better for close-up pics of my babies and the close-up shoe pics I take, but my pics are blurry and unfocused. I didn’t research before swapping, and my phone doesn’t have the HDR option. Can someone help me out?

I didn’t use any apps on my 13, but as long as I was steady, my pics were pretty clear. With the 16 I have to take 50 pics and I’m lucky if a few are kinda clear, but still fuzzy.


PLEASE HEEEEEELLLLLPPPP
 
Are the lenses clean? The 16 shouldn’t be any worse than a 13 Pro. Is it the vanilla 16?

I need more detail about the shots: are those macros, or taken with the standard wide main camera?
 
Are the lenses clean? The 16 shouldn’t be any worse than a 13 Pro. Is it the vanilla 16?

I need more detail about the shots: are those macros, or taken with the standard wide main camera?

Lenses are clean, it’s a new phone. What is the vanilla 16? I don’t know about the macros or wide camera.

If I take a close up pic of anything, it’s not clear. I don’t use any apps or mess with settings. On my 13, I would hit photo and it was crystal clear. I tried to upload a pic on here but it said it was too large.

The food pic I took, some parts of the food are clear, some aren’t. I don’t know if it’s a settings issue or I’m just not well versed enough to know how to use this camera.
 
The newer iPhones have a larger sensor which makes the edges appear blurry sometimes.

Also, they have a longer focus distance so you have to be further away from the subject for it to focus.
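A rough rule of thumb behind this (a standard optics approximation, not an iPhone-specific spec): at the same framing and the same f-number N, background blur scales with the physical aperture diameter,

\text{blur} \;\propto\; \frac{f}{N}

and because a larger sensor needs a longer real focal length f for the same field of view, the same f-number gives more blur and a shallower zone of sharpness.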
 
Lenses are clean, it’s a new phone. What is the vanilla 16? I don’t know about the macros or wide camera.

If I take a close up pic of anything, it’s not clear. I don’t use any apps or mess with settings. On my 13, I would hit photo and it was crystal clear. I tried to upload a pic on here but it said it was too large.

The food pic I took, some parts of the food are clear, some aren’t. I don’t know if it’s a settings issue or I’m just not well versed enough to know how to use this camera.
By “vanilla iPhone 16” I meant the regular 16, not the 16 Pro.
By the way, the main camera (the one you are basically using when zoom is set at 1x) is much better than on your previous iPhone 13 Pro, so it’s probably just a matter of focus, unless you received a defective unit.
Try changing the distance from your subject a little, or tap the screen on the exact point you want the camera to focus on just before you shoot.
 
The newer iPhones have a larger sensor which makes the edges appear blurry sometimes.

Also, they have a longer focus distance so you have to be further away from the subject for it to focus.

So I won’t be able to take close pics? Is there a setting for it?

By “vanilla iPhone 16” I meant the regular 16, not the 16 Pro.
By the way, the main camera (the one you are basically using when zoom is set at 1x) is much better than on your previous iPhone 13 Pro, so it’s probably just a matter of focus, unless you received a defective unit.
Try changing the distance from your subject a little, or tap the screen on the exact point you want the camera to focus on just before you shoot.

Oh, I’m sorry. I forgot to put Pro Max. I usually shoot at 1x; I just want to take clear close-up pics.

Also, sometimes when I shoot videos, I see the HDR indicator and my videos look fine. Can I turn the HDR on for my photos? I know the 13 said HDR photos.

The portrait pics seem a little clearer than the standard photos but I don’t want to always take portrait photos.

When I take the portrait pic, it looks okay. Then I zoom in and it’s crystal clear, zoom out and it looks better. It’s almost like I have to zoom in and then zoom out so it can fully load.


Thanks for the help
 
The ultra wide camera is used for close up pics.

Do I need to turn that on every time I take a close pic? How do I make sure that’s on?

See, I just took this pic. I had to crop it so it wasn’t too big to upload.

I focused on the name on the ChapStick and that isn’t crystal clear, and the rest of the pic around it is blurry. Is this phone not capable of taking regular clear pics, with everything in the photo being clear?

[Attached photo: IMG_4788.jpeg]
 
The issue you are having is the depth of field in your new phone is much shallower because of a larger aperture and/or a larger sensor in the camera. The suggestion to back up and use the other lenses is a good one, or you could download a camera app that lets you control the aperture and set it to a smaller (f/larger number) aperture to increase depth of field. I am a professional photographer and when taking macro photos I generally use f/8 or even smaller.
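For anyone who wants the rough math behind that (a standard thin-lens approximation, not an iPhone-specific formula; N is the f-number, c the circle of confusion, f the real focal length, s the subject distance, and H \approx f^{2}/(N c) the hyperfocal distance):

\text{DOF} \;\approx\; \frac{2\,N\,c\,s^{2}}{f^{2}} \qquad (s \ll H)

So stopping down from about f/1.8 to f/8 multiplies the depth of field by roughly 4.4x, and doubling the shooting distance multiplies it by about 4x, which is why both “use a smaller aperture” and “back up a bit” widen the zone of sharpness.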
 
OP might be referring to the bokeh effect that newer iPhones naturally produce and that is more pronounced than in older iPhones.

Otherwise, the ultra wide/macro lens has been accidentally disabled, so the main camera is being used for close-up photos. But I can’t imagine how that could have happened if the OP doesn’t know about any of this; the macro control has probably never appeared in the camera app because that setting has never been toggled on.

🤷‍♂️
 
So I won’t be able to take close pics? Is there a setting for it?
Sometimes I just get a little further away and use the 2x lens. It feels counterintuitive but it works nicely, and as a bonus the background gets a little blurred (aka "bokeh") because of the optics -- independent of the artificial hit and miss Portrait Mode.
 
Like @Timpetus and @antiprotest mentioned above, from the two examples you posted, this is the natural bokeh effect of your newer phone having a larger sensor and a brighter lens than your previous one.

Bokeh is the blurry look of things in the foreground and background that are not the subject the lens is focused on. For example, a portrait where the subject is sharp but there is a soft and blurry background.

The bokeh blur is most noticeable under a few conditions:
  1. A larger sensor tends to show more blur of out-of-focus items than a smaller sensor under the same conditions. You have gone to a phone with a larger sensor.
  2. A larger aperture (the opening in the lens that lets the light through) will provide a more shallow area of sharpness, with the point of focus being sharp and a more dramatic fall off of sharpness of foreground and background elements, than a smaller aperture. The variety of lenses you could be using on these two different cameras have different apertures, so without a (sorry) apples-to-apples comparison of photos shot the same way with the two phones, it's hard for me to tell which aperture is being used. But even the same aperture being used in both cameras under the same conditions will result in a shallower depth of field (d.o.f. is items that are acceptably sharp in front of and behind the point of focus) with the new, larger sensor camera. In other words, if you were to put the 13 Pro Max and 16 Pro Max next to each other and shoot the same thing the same way, it's likely that the 16 would have a more shallow depth of field, with more of the blurriness of the things that aren't the point of focus.
  3. Distance from camera to subject. You are shooting closeups, and the falloff from sharp to out of focus is more exaggerated when your camera is closer to your subject. That's why some replies above have suggested you take pictures from slightly further away if you want everything sharp.
  4. Telephoto lenses tend to show more exaggerated bokeh than the same subject, same settings, as photos shot with a wider lens. So try using a wider lens for closeups.
The 16 Pro Max has a macro setting. Make sure you are using that when you take closeup photos. Here's a quick video that has several tips for macro photography on your new iPhone. I hope this helps!
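To put rough numbers on points 2 and 3 in the list above, here is a quick worked example (the focal length, f-number, and circle of confusion below are purely illustrative assumptions, not measured iPhone specs):

\text{DOF} \approx \frac{2 N c s^{2}}{f^{2}} = \frac{2 (1.8)(0.003\,\text{mm})(100\,\text{mm})^{2}}{(6\,\text{mm})^{2}} \approx 3\,\text{mm at } s = 10\,\text{cm}, \qquad \approx 12\,\text{mm at } s = 20\,\text{cm}

In other words, at a 10 cm shooting distance only a few millimetres of the scene are in sharp focus, and simply backing up to 20 cm roughly quadruples that zone, which matches the advice above.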

 
Before you take the picture, is there a flower symbol in the bottom left corner in yellow?

I too was confused comparing some pictures from my 15 Pro to my 11 Pro. For regular to "mid-close" pictures it would switch into macro mode automatically, and turning that off revealed that macro mode often took worse pictures, because it uses the ultra wide camera zoomed in as if it were the main camera, instead of the main camera itself. Ruined a couple of shots before I realized.
 
My iPhone 16 Pro Max’s ultrawide camera doesn’t focus 9/10 times. Not sure if it’s a bug or a hardware issue. It’s like the ultrawide lens only has a fixed focus distance and autofocus doesn’t work. I was so disappointed in the new iPhone cameras that I picked up a proper full-frame mirrorless in the Black Friday sale, haha.
 
The issue you are having is the depth of field in your new phone is much shallower because of a larger aperture and/or a larger sensor in the camera. The suggestion to back up and use the other lenses is a good one, or you could download a camera app that lets you control the aperture and set it to a smaller (f/larger number) aperture to increase depth of field. I am a professional photographer and when taking macro photos I generally use f/8 or even smaller.
This above is the technical explanation of what the OP is reporting. The best solution is to manually change the aperture (if there is enough light) to 8 or more.

Just a side note: there are many good apps for that, but you don’t really need one. You can change the aperture even with the native Camera app on iOS (and you can do that with the camera button).
 
or you could download a camera app that lets you control the aperture and set it to a smaller (f/larger number) aperture to increase depth of field. I am a professional photographer and when taking macro photos I generally use f/8 or even smaller.

This above is the technical explanation of what the OP is reporting. The best solution is to manually change the aperture (if there is enough light) to 8 or more.

Just a side note: there are many good apps for that, but you don’t really need one. You can change the aperture even with the native Camera app on iOS (and you can do that with the camera button).
It’s worth noting that the aperture on each iPhone camera is fixed; it can’t be physically changed like a traditional camera lens. The depth setting in the camera app is software-only. Because Portrait Mode is technically always on in iPhone 15 and later models, you can adjust the software-based bokeh effect using virtual f-stops. Selecting a higher aperture will reduce or remove the software blur, but there will still be natural bokeh due to the size of the sensor and the fixed aperture. As others have said, the best solution is to switch to a longer focal length lens and physically back up to increase the depth of field.
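For anyone wondering what a third-party camera app can and cannot do here, a minimal AVFoundation sketch (the function name and lens fallback are just illustrative assumptions, and the rest of the capture-session setup is assumed to live elsewhere) can explicitly select one physical rear lens and set a focus point, but there is no API to change the physical aperture, which is fixed per lens:

```swift
import AVFoundation
import CoreGraphics

// Sketch: explicitly pick one physical rear camera and focus on a tapped point.
// AVFoundation exposes no way to change the physical aperture -- it is fixed per lens.
func selectRearCamera(focusPoint: CGPoint) throws -> AVCaptureDevice? {
    // Ask for the telephoto lens first, falling back to the main wide lens,
    // so the system never silently switches lenses on us.
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInTelephotoCamera, .builtInWideAngleCamera],
        mediaType: .video,
        position: .back)
    guard let device = discovery.devices.first else { return nil }

    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    // Equivalent of tap-to-focus: the point is in normalized (0...1) camera coordinates.
    if device.isFocusPointOfInterestSupported, device.isFocusModeSupported(.autoFocus) {
        device.focusPointOfInterest = focusPoint
        device.focusMode = .autoFocus
    }
    return device
}
```

The “switch to a longer lens and back up” advice above is the manual version of the same idea: pick the telephoto lens, keep a bit more distance, and let the fixed aperture work in your favour.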
 
I’ve tried backing away and zooming in.
Like @Timpetus and @antiprotest mentioned above, from the two examples you posted, this is the natural bokeh effect of your newer phone having a larger sensor and a brighter lens than your previous one.

Bokeh is the blurry look of things in the foreground and background that are not the subject the lens is focused on. For example, a portrait where the subject is sharp but there is a soft and blurry background.

The bokeh blur is most noticeable under a few conditions:
  1. A larger sensor tends to show more blur of out-of-focus items than a smaller sensor under the same conditions. You have gone to a phone with a larger sensor.
  2. A larger aperture (the opening in the lens that lets the light through) will provide a more shallow area of sharpness, with the point of focus being sharp and a more dramatic fall off of sharpness of foreground and background elements, than a smaller aperture. The variety of lenses you could be using on these two different cameras have different apertures, so without a (sorry) apples-to-apples comparison of photos shot the same way with the two phones, it's hard for me to tell which aperture is being used. But even the same aperture being used in both cameras under the same conditions will result in a shallower depth of field (d.o.f. is items that are acceptably sharp in front of and behind the point of focus) with the new, larger sensor camera. In other words, if you were to put the 13 Pro Max and 16 Pro Max next to each other and shoot the same thing the same way, it's likely that the 16 would have a more shallow depth of field, with more of the blurriness of the things that aren't the point of focus.
  3. Distance from camera to subject. You are shooting closeups, and the falloff from sharp to out of focus is more exaggerated when your camera is closer to your subject. That's why some replies above have suggested you take pictures from slightly further away if you want everything sharp.
  4. Telephoto lenses tend to show more exaggerated bokeh than the same subject, same settings, as photos shot with a wider lens. So try using a wider lens for closeups.
The 16 Pro Max has a macro setting. Make sure you are using that when you take closeup photos. Here's a quick video that has several tips for macro photography on your new iPhone. I hope this helps!


For close-ups, I usually use 1x. I feel like if I back up and use 2x, it doesn’t look good. I’ll turn Macro on and watch the video you sent, thanks.

This above is the technical explanation of what the OP is reporting. The best solution is to manually change the aperture (if there is enough light) to 8 or more.

Just a side note: there are many good apps for that, but you don’t really need one. You can change the aperture even with the native Camera app on iOS (and you can do that with the camera button).

How do I change the aperture on the phone?

Thanks

It’s worth noting that the aperture on each iPhone camera is fixed; it can’t be physically changed like a traditional camera lens. The depth setting in the camera app is software-only. Because Portrait Mode is technically always on in iPhone 15 and later models, you can adjust the software-based bokeh effect using virtual f-stops. Selecting a higher aperture will reduce or remove the software blur, but there will still be natural bokeh due to the size of the sensor and the fixed aperture. As others have said, the best solution is to switch to a longer focal length lens and physically back up to increase the depth of field.

Do you have the option on this phone of selecting which lens you want to use?

Thanks
 
You can’t change the aperture for the cameras; they are all fixed. You can only change the simulated aperture when a photo has Portrait mode activated...
 
Thanks to everyone explaining how the camera works on the newer phones. I think it says a lot about how in line this is(n't) with the "it just works" motto when we average users need an explanation for wtf is going on with our photos.

If I had a choice between the camera on the 13 Pro and the 15 Pro, I'd choose 13 Pro any day. They gave the 15 Pro a camera that is technically better, but not only does it not look better for the average user, sometimes it looks worse.

Even though my issue is not about photos, it has to do with the camera system - I can no longer scan documents with my 15 Pro because of the depth of field change, as the letters that are not in the center of the document are slightly out of focus and uncomfortable to look at. So the "advancements" to the 15 Pro camera now have me scanning images with my iPad instead. This is a ridiculous downgrade as the camera is not used for photos exclusively, and Apple had to have known that they were effectively ruining document scanning with these changes, which is a very strange decision given that the Notes app offers a built-in scanning feature.

Not saying you can't get better photos by knowing which lens to use and how far away to back up from the subject, but we shouldn't have to do this to avoid parts of the photo being just outright blurry. Is there a third-party camera app that reduces these effects without having to tinker with the camera settings?
 
You can’t change the aperture for the cameras; they are all fixed. You can only change the simulated aperture when a photo has Portrait mode activated...
The “simulated” aperture works quite well, and you can change f-stops without selecting Portrait mode.
Thanks to everyone explaining how the camera works on the newer phones. I think it says a lot about how in line this is(n't) with the "it just works" motto when we average users need an explanation for wtf is going on with our photos.

"It just works" is a joke. Even an average user needs to know photography basics in order to obtain good results.
If I had a choice between the camera on the 13 Pro and the 15 Pro, I'd choose 13 Pro any day. They gave the 15 Pro a camera that is technically better, but not only does it not look better for the average user, sometimes it looks worse.

The camera is better in every way.
Even though my issue is not about photos, it has to do with the camera system - I can no longer scan documents with my 15 Pro because of the depth of field change, as the letters that are not in the center of the document are slightly out of focus and uncomfortable to look at. So the "advancements" to the 15 Pro camera now have me scanning images with my iPad instead. This is a ridiculous downgrade as the camera is not used for photos exclusively, and Apple had to have known that they were effectively ruining document scanning with these changes, which is a very strange decision given that the Notes app offers a built-in scanning feature.

Not saying you can't get better photos by knowing which lens to use and how far away to back up from the subject, but we shouldn't have to do this to avoid parts of the photo being just outright blurry. Is there a third-party camera app that reduces these effects without having to tinker with the camera settings?
If you have issues with document scanning, your iPhone is not working properly.
Are you sure you are using the right procedure? To scan a document you don’t need to take a photo.

 
If you have issues with document scanning, your iPhone is not working properly.
Are you sure you are using the right procedure? To scan a document you don’t need to take a photo.
I think it's impressive that you can tell my iPhone is not working properly right off the bat.

I've been using the Microsoft Lens app for years; it works great on every device except the 15 Pro. I work from home so I use my iPhone and iPad for scans, and I know what kind of results I'm getting. That effect of blurring the edges of the shot happens while scanning documents as well, since the aperture is software-only, so the text at the top and/or bottom of the page will be ever so slightly blurred no matter what I do. Perfectly legible, but at least a bit blurred/out of focus nonetheless, while the text in the middle is sharp.
The scanning feature in the Notes app is far worse than Lens, because Lens at least allows you to zoom in and out, so as to influence the lenses used. The scan is passable as all of the text can be read, but still not as good as the 13 Pro, which didn't have these focus issues.
 