But hmm. I did try to replicate the same scene on my phones (16 Pro Max and 17 Pro Max) and both show some odd artifacts that I haven't seen before.

It seems that the completely dark room, combined with the bright light shining through the gaps in the door, creates a very tricky condition for the camera.

I've shot thousands of astro shots with this 16 Pro Max and never seen any such artifacts before.

My new 17 Pro Max is probably a little better.


Apple iPhone 17 Pro Max (Stock camera, 1x, 24mm, 2s, f/1.7, ISO 6400, 12.19MP)


Apple iPhone 16 Pro Max (Stock camera, 1x, 24mm, 2s, f/1.7, ISO 6400, 12.19MP)
Looks like a processing (SW) issue. In my case it has been broken since iOS 18.2 and never patched. Strange that your astrophotography comes out alright though.
 
I think it is some kind of scene detection that really tries to boost shadows too aggressively.

Most likely that is the reason why some users now see the blue streaks/clouds in pitch black on units that were previously unaffected.
 
Apple engineers might have to tweak/fine-tune the new camera sensors for the upcoming 18 series and iOS 27, so fingers crossed they will eventually nail the issue next year.
 
Hopefully they will upgrade the hardware and that should force them to recalibrate the software too.

Apple's night mode is very aggressive and boosts shadows more than most other phones I've tried. Probably a little too much for the hardware.
 
Okay, now I'm lost. I was about to wipe my phone and return it today, but now I'm not sure I should, because if it's a software issue then it makes no sense to swap it for a new unit. By the way, I just asked my colleague to perform that famous tabletop test with his iPhone 15 (there is no ProRAW on it: it's not a Pro, nor a 16- or 17-series model). Results are attached below. Is it happening to all iPhone models?
IMG_5570.jpeg
 
I think your photos look a bit worse than mine; the blue streaks are more visible. However, our testing is far from scientific, and my unit could very well show the same artifacts in your scene.

My advice is to take some real night shots, perhaps of the night sky or a scene you would normally shoot (not just for testing purposes). If you see the blue streaks in those shots, then I would definitely send it back.
 
Yes, it is sadly...
 
That's what I am trying to say. I believe all my tests are flawed because of my lack of technical knowledge of photography. Night mode pictures taken in my city look pretty good, but they all max out at ISO 2500 with a 3 s exposure.
I can take a picture of the sky tonight if the weather is fine, but I have Bortle class 6 in my area; that is the lowest class I can find nearby.
 
That's what I have noticed on my 16 Pro too (when doing this test). I don't remember it looking like this before.
 
I think it is some kind of scene detection that really tries to boost shadows too aggressively.

Most likely that is the reason why some users now see the blue streaks/clouds in pitch black on units that were previously unaffected.
Could you please elaborate on that?

My understanding is that if you shoot ProRAW there should be no post-processing. Or am I missing something?

Also, while reading photo.stackexchange.com, I found some explanations of ISO and SNR. The darker the scene, the lower the SNR you get. All sensors have some noise. When the exposure is complete, the camera performs a read-out from the sensor; ISO is applied afterwards. Even if you fully cover the lens and take a picture, the values from the sensor will never be 0, only close to 0. This is called bias. Modern cameras can subtract the bias before writing the data. Maybe Apple's software doesn't do this correctly?
Thermals can affect the end picture too; I noticed that as well.
But most importantly: high ISO doesn't cause noise, because it is a sort of digital amplifier applied to the result after read-out in an attempt to get the best picture. The noise is always there, with or without high ISO.

I learned a lot of stuff today))
All this makes me think it might be a defective sensor. At the same time, it might also be software. But as far as I understand, RAW should not have any post-processing.
I might be wrong, feel free to correct me.

Update: it seems that Apple does apply post-processing to ProRAW. This sentence states it:

“Apple ProRAW combines the information of a standard RAW format along with iPhone image processing…”
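
The bias and read-noise behavior described above can be sketched numerically. This is a toy simulation, not Apple's pipeline; the bias, noise, and gain values are made up for illustration. It models a fully dark read-out as bias plus random read noise, then applies a digital ISO-style gain with and without subtracting the bias first.

```python
import random

random.seed(42)

BIAS = 64          # hypothetical sensor bias offset, in raw counts
READ_NOISE = 3.0   # hypothetical read-noise standard deviation
GAIN = 16          # digital "ISO" amplification applied after read-out

# Read-out of a fully dark sensor: every pixel is bias + read noise.
dark_frame = [BIAS + random.gauss(0, READ_NOISE) for _ in range(10_000)]

# Naive pipeline: amplify without removing the bias first.
naive = [px * GAIN for px in dark_frame]

# Better pipeline: subtract the bias (clamping at 0), then amplify.
corrected = [max(px - BIAS, 0) * GAIN for px in dark_frame]

mean_naive = sum(naive) / len(naive)
mean_corrected = sum(corrected) / len(corrected)

print(f"mean level with bias left in:  {mean_naive:.1f}")
print(f"mean level with bias removed:  {mean_corrected:.1f}")
```

With the bias left in, a pitch-black frame comes out uniformly lifted, and any later shadow boost amplifies that lift into visible haze; with the bias subtracted, only the clipped residual noise survives.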

 
So everything points towards software/processing?
Of course it may be a sensor fault that somehow accumulates more noise.
But why are only the dark parts of the image prone to blue/red/purple streaks? If it were a sensor issue, wouldn't it appear in the same locations on every photo?
If the iPhone applies such aggressive post-processing to shadows even in ProRAW, then it's a software issue.
I saw people take RAW pictures with third-party apps and get different results compared to the stock iPhone camera.
Can anyone confirm this? Are there any limitations for third-party apps using Apple's camera APIs?
 
I can confirm that the issue is not present while using the ProCam app for long exposures in a pitch black room.
 

ProRAW is a post-processed ("fake") RAW format. It does multi-frame merging and also applies noise reduction and sharpening to the image.

Basically, ProRAW and HEIC/JPG get the same post-processing.

Luckily, Apple's camera API supports regular RAW (sometimes called Bayer RAW), which is absolutely untouched. You can shoot in this format with third-party apps like ProCamera or Halide.

--

The main issue is that camera sensors always produce this kind of blue/green/red color static when pushed really hard, i.e. at high ISO with raised shadows.

If I shoot a Canon 5DS in a dark room and then raise the shadows, I get the exact same blue tint as some examples here from the iPhone tabletop test.

If I do it with a Nikon D800, I instead get a completely red-tinted image.

But it does seem that different iPhones can handle different amounts of shadow raising before the blue streaks become visible. It is also possible that Apple has tweaked the amount of shadow boost, and that made the issue worse.
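
The color-tint effect described above can be mimicked with a toy model (the per-channel numbers are invented, not measured from any real sensor): give each color channel a slightly different noise level, apply an aggressive shadow boost to a black frame, and compare the resulting channel means.

```python
import random

random.seed(0)

# Hypothetical per-channel read-noise levels; on a real sensor these can
# differ because of channel gains and white-balance multipliers.
NOISE_STD = {"R": 2.0, "G": 2.0, "B": 3.5}
SHADOW_BOOST = 50  # aggressive digital gain applied to near-black pixels

def boosted_channel_mean(std: float, n: int = 20_000) -> float:
    """Mean level of a black frame after clipping at 0 and boosting shadows."""
    samples = (max(random.gauss(0, std), 0) * SHADOW_BOOST for _ in range(n))
    return sum(samples) / n

means = {ch: boosted_channel_mean(std) for ch, std in NOISE_STD.items()}
print(means)
```

The noisiest channel ends up with the highest mean after the boost, so raised shadows take on that channel's color cast, matching the blue-versus-red difference between the two cameras described above.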
 
Thank you for such a detailed explanation and overview.
 
ProCam: “Slow shutter” mode, Low Light, ISO 12500, exposure 30 sec.
Great result!
The only difference I can see is that you have an EV value of 0, whereas the stock camera usually gives 3.7 to 4 EV in completely dark conditions. Not sure if it matters in this case.
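
For reference, the textbook photographic exposure value can be computed from the capture settings quoted earlier in the thread (f/1.7, 2 s, ISO 6400). This is only a sketch of the standard formula; note that the EV shown in iPhone photo metadata is typically the exposure bias the camera chose to apply, which is a different quantity.

```python
import math

def exposure_value(f_number: float, shutter_s: float, iso: float = 100) -> float:
    """Classic exposure value, EV = log2(N^2 / t), re-referenced to ISO 100."""
    return math.log2(f_number ** 2 / shutter_s) - math.log2(iso / 100)

# Settings from the sample shots posted earlier: f/1.7, 2 s, ISO 6400.
ev = exposure_value(1.7, 2, 6400)
print(f"EV100 = {ev:.2f}")  # strongly negative, i.e. a very dark scene
```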
 
Yup, I noticed that too. I'll look into it in the coming days; maybe there is a way to manually increase the exposure value to match the one from the default Camera app.
 
If you do that, you will enhance artifacts that are not meant to be seen. That is the reason the tabletop test does not work: the camera attempts to recover detail in complete darkness and just shows the sensor interference instead.
 
I don't remember how the EV values looked on the tabletop test photos prior to iOS 18.2.
 
Well, it seems the iPhone 17 Pros are also affected. I just returned from a shop where I messed around with the phones, and both the Pro and the Pro Max had this issue. Before we say it's not a relevant test, let me add that the 17 Air and the plain 17 do not show this issue in the same test. Definitely something hardware-related, together with the software processing.

I live at the equator with plenty of light pollution AND cloud cover around 90% of the time, so I've given up on astrophotography, and even astronomy altogether.
But should there arise a time and opportunity to see the aurora, you can bet your last dollar I'm not going with the iPhone (at least not with it alone).

I'm already planning my move back to Android land. Recently both Oppo and Vivo have shown extremely good chops in photography, and it looks like Apple's software is really showing its age, with the lousy AI integration on top. There seems to be more than one factor pulling me towards Android land again!
 
That's exactly what I did! I purchased two Samsung S25 Ultra phones. It's a nice change from iOS, which I've used since 2009. The cameras are the reason I switched (especially Expert RAW): full DSLR-like control. I almost chose the Pixel 10 Pro, but the 200 MP camera on the S25 Ultra along with Expert RAW is what sold me. Too much AI with Apple; I didn't (don't) use any of it anyway.
 