
Grobaouche (original poster):
[Attached images: IMG_0740.png, IMG_0741.png, Capture d’écran 2024-02-28 à 15.16.45.png]
Hello guys!

I was just taking some photos for work and noticed something odd.

I took two photos of the exact same object with the exact same camera settings: main sensor, 1.2x, no zoom, no filters, nothing.

If you look at the far-field one, the colors are fine compared to real life (trust me), but when I take a picture up close (without triggering macro mode), you can clearly see that the colors shift drastically and are completely different!

I managed to avoid this by forcing the exposure adjustment onto the white floor instead of the jacket. Basically, the iPhone 15 seems to favor the central object and tries to brighten it, but that completely shifts the white balance!

In the last picture you can also see some color aberrations on the jacket, like marks that are absolutely not there in real life. How can the image processing be struggling so much?

Thanks for your advice and tips!
 
It’s not the sensor. It’s the computing.

When framing the photo, try tapping directly on the shirt, then holding and dragging down until the color looks accurate to you.

I realize this isn’t a fix for the auto problems, but it should get you a correct photo.
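The built-in Camera app only gives you the tap-and-drag gesture, but if you end up using (or writing) a third-party capture app, the same idea can be done programmatically with AVFoundation: lock the white balance at its current gains so it can't drift when you move closer, and meter exposure from a neutral area like the white floor. A minimal sketch follows; the helper name and the choice of metering point are mine, not from the thread, and assume you already have a configured AVCaptureDevice.

import AVFoundation
import CoreGraphics

/// Hypothetical helper: freeze auto white balance at its current gains and
/// meter exposure from a chosen point (e.g. a neutral white floor) instead
/// of letting the pipeline re-balance around the central subject.
func lockColorAndMeter(from point: CGPoint, on device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    // Lock white balance at whatever the auto algorithm has settled on,
    // so moving closer to the subject no longer shifts the colors.
    if device.isWhiteBalanceModeSupported(.locked) {
        let currentGains = device.deviceWhiteBalanceGains
        device.setWhiteBalanceModeLocked(with: currentGains, completionHandler: nil)
    }

    // Meter exposure from the given point (normalized 0...1 coordinates,
    // (0,0) = top-left of the frame) instead of the central object.
    if device.isExposurePointOfInterestSupported,
       device.isExposureModeSupported(.continuousAutoExposure) {
        device.exposurePointOfInterest = point
        device.exposureMode = .continuousAutoExposure
    }
}

For example, lockColorAndMeter(from: CGPoint(x: 0.5, y: 0.9), on: device) would meter near the bottom of the frame, roughly where the white floor sits in the OP's shots.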
 