I was just taking some photos for work and I noticed something odd.
I took two photos of the exact same object, with the exact same camera settings: main sensor, 1.2x, no zoom, no filters, nothing.
If you look at the far-away shot, the colors are accurate compared to real life (trust me), but when I take a picture up close (without triggering macro mode), you can clearly see that the colors shift drastically and end up completely different!
I managed to avoid this by forcing the exposure adjustment onto the white floor instead of the jacket. Basically, the iPhone 15 seems to favor the central object and tries to brighten it, but that completely shifts the white balance!
Also, in the last picture you can see some color aberrations on the jacket, like marks that are absolutely not present in real life. How can the image processing be struggling this much?
Thanks for your advice and tips!