I've seen a few references here to using RAW to avoid Smart HDR. I'm afraid this may be a myth.

I took some photos in both 'normal' and 'RAW' format, in an artificial cave lit by a studio light. That light was not washed out in RAW, as it should have been. Instead, it showed the same Smart HDR effect as the 'normal' image, which turned the light source blue rather than the white it would have been without HDR - not to mention the 'ghosting'/diffusion (most visible on the rocks) caused by the HDR effect.

Edit: I converted the RAW .dng to .jpg and added the file below. See what I mean about the massive amount of processing in that untouched RAW image? It looks identical to the 'normal' Smart HDR pics I took in that same cave.

In other words, using RAW did *not* solve the HDR/processing problem.
 

Attachments

  • 1241.jpg
I just came here to post about this and found this thread.

It seems that our choice is:
  • Shoot in RAW with Halide, Camera+ 2, or another camera app when I want complete control. (As noted in the last post, ProRAW still applies some of Apple's proprietary/algorithmic adjustments, including Smart HDR.)
  • Shoot in ProRAW when I want some of Apple's AI/algorithm magic, but more control than I'd have with HEIC/JPEG.
  • Shoot in HEIC when I want all of Apple's AI/algorithms.
I am upset that Apple took away the ability to turn off Smart HDR, but for now, I can live with this. Most of the time, in a scene where I know I'll want to edit later (either because the exposure is challenging, or because I want to make certain creative choices), I'll choose ProRAW or RAW anyhow.

In simpler and more straightforward shooting conditions, I will generally want the Smart HDR and other goodies that come with shooting in Apple's HEIC.

A note on ProRAW... if you import these files into Lightroom, you at least have *some* control over how the profile is applied. And you can also choose a different profile if desired. In the shots below, #1 has the default ProRAW profile. #2 has the ProRAW profile dialed down to "0" using the slider. #3 has one of the "Modern" profiles applied.

Screen Shot 2021-10-11 at 6.18.33 AM.png


Screen Shot 2021-10-11 at 6.19.53 AM.png


Screen Shot 2021-10-11 at 6.20.21 AM.png


So, we still have quite a bit of control over the ProRAW files in post-production.

I submitted feedback to Apple asking them to give us the ability to turn Smart HDR off again. You can submit feedback for the iPhone on Apple's product feedback page.
 
Hi.
I just bought this phone, coming from Samsung, and I notice scene detection mentioned as one of the camera settings, but I don't even have that option. I got a Pro Max.
 
I've seen a few references here to using RAW to avoid Smart HDR. I'm afraid this may be a myth.

I took some photos in both 'normal' and 'RAW' format, in an artificial cave lit by a studio light. That light was not washed out in RAW, as it should have been. Instead, it showed the same Smart HDR effect as the 'normal' image, which turned the light source blue rather than the white it would have been without HDR - not to mention the 'ghosting'/diffusion (most visible on the rocks) caused by the HDR effect.

Edit: I converted the RAW .dng to .jpg and added the file below. See what I mean about the massive amount of processing in that untouched RAW image? It looks identical to the 'normal' Smart HDR pics I took in that same cave.

In other words, using RAW did *not* solve the HDR/processing problem.

Are these using Pro Raw or a third party camera app?
 
So this seems like quite a serious issue. Well, it would be for me anyway, especially for a phone that costs £1,000. I'm just wondering why more people aren't mentioning it. There doesn't seem to be much I can find on YouTube, and some of the photos in the Macro thread and the general iPhone 13 Pro/Pro Max photos thread look simply stunning. I thought perhaps something might appear on the front page of this website about the issue, but I've not seen anything.

I'm just trying to find out as much as I possibly can before my new iPhone 13 Pro arrives. It's on order with Apple and due for delivery between 3-10 November. I keep hoping to see that there'll be a fix/solution in a future iOS update. I'm reluctant to fork out so much money for a new phone if I'm going to be disappointed with the photos, especially as the cameras are the main reason I'm upgrading.
 
So this seems like quite a serious issue. Well, it would be for me anyway, especially for a phone that costs £1,000. I'm just wondering why more people aren't mentioning it. There doesn't seem to be much I can find on YouTube, and some of the photos in the Macro thread and the general iPhone 13 Pro/Pro Max photos thread look simply stunning. I thought perhaps something might appear on the front page of this website about the issue, but I've not seen anything.

I'm just trying to find out as much as I possibly can before my new iPhone 13 Pro arrives. It's on order with Apple and due for delivery between 3-10 November. I keep hoping to see that there'll be a fix/solution in a future iOS update. I'm reluctant to fork out so much money for a new phone if I'm going to be disappointed with the photos, especially as the cameras are the main reason I'm upgrading.

The "problem" becomes apparent only in some cases, such as low light or HDR photos. Most of the time the photos look quite normal, unless you zoom in. If you like the photos in the photos thread, you'll be pleased. All iPhone 13s take the same photos.
 
I've seen a few references here to using RAW to avoid Smart HDR. I'm afraid this may be a myth.

I took some photos in both 'normal' and 'RAW' format, in an artificial cave lit by a studio light. That light was not washed out in RAW, as it should have been. Instead, it showed the same Smart HDR effect as the 'normal' image, which turned the light source blue rather than the white it would have been without HDR - not to mention the 'ghosting'/diffusion (most visible on the rocks) caused by the HDR effect.

Edit: I converted the RAW .dng to .jpg and added the file below. See what I mean about the massive amount of processing in that untouched RAW image? It looks identical to the 'normal' Smart HDR pics I took in that same cave.

In other words, using RAW did *not* solve the HDR/processing problem.

That is just sensor data in a digital negative (DNG); you need to process (develop) the negative into a photo before we can judge it. All I see is a total lack of processing: the white balance is off, some of the colors fall outside the range of displayable color spaces, it's flat from unmapped dynamic range, and all of that makes it look noisy.
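To make the "develop" step concrete: at minimum, a raw converter white-balances the scene-linear values and gamma-encodes them for a display before the file looks like a photo. A minimal, illustrative Python sketch - the multipliers and gamma below are made-up numbers, not Apple's actual pipeline:

```python
def develop(rgb_linear, wb=(2.0, 1.0, 1.6), gamma=2.2):
    """Turn one scene-linear RGB triple into a display-referred triple.

    wb    -- per-channel white-balance multipliers (illustrative values)
    gamma -- display encoding exponent (2.2 is a common approximation)
    """
    # White balance: scale each channel, clipping at the display maximum.
    balanced = [min(1.0, c * m) for c, m in zip(rgb_linear, wb)]
    # Gamma encode: without this, midtones preview dark and flat.
    return [round(c ** (1.0 / gamma), 3) for c in balanced]

# A neutral grey as the sensor saw it under warm light: too little red
# relative to green, and still linear, so an undeveloped preview looks
# dark and colour-shifted - exactly the symptoms described above.
print(develop([0.09, 0.18, 0.11]))
```

Skip either step and you get the complaints in that photo: a colour cast from the missing white balance, or a dark, flat image from the missing gamma encode.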

Smart HDR can be turned off in ProRAW; it's just tags in the DNG that are associated with sliders. However, the problem with that photo is that it's not a photo, it's a negative.

ProRAW is still a RAW, just with the demosaic already done and scene-referred values embedded (NOT the display's color space range). The file carries the original dynamic range and the original color values, both of which still need to be mapped to a display color space and a proper dynamic range.

Apple's computational photography (namely Smart HDR and Deep Fusion) exists for several reasons. First, it makes up for camera hardware that doesn't fit in a phone. Second, it offers a path for consumers who want RAW capture but find the jump into RAW editing unrealistic. Third, it does some of the heavy lifting normally done in post, so that work can still be accomplished on a mobile device.

Smart HDR, contrary to what the hysteria would lead everyone to believe, is simply in charge of defining the values for the tags in the DNG: local bright and shadow regions, local tone mapping, local linearization, local sharpening, and local exposure. Since the demosaic is already done, this Smart HDR data is non-destructive - it's stored separately from the sensor's captured image. Smart HDR is just data connected to sliders in your editing suite of choice. If you don't want to use it, you don't have to; set the sliders to zero. Although setting that data to zero doesn't leave you with much.
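In other words, the Smart HDR rendering can be modeled as adjustment data applied on top of the demosaiced values rather than baked into them. A hedged sketch of that idea - the gain-map representation below is my own simplification for illustration, not the actual DNG tag format:

```python
def apply_local_tone_map(linear, gain_map, strength):
    """Blend a per-pixel gain map into scene-linear values.

    linear   -- scene-referred linear values (0.0-1.0), one per pixel
    gain_map -- multiplicative adjustments chosen per pixel (e.g. by Smart HDR)
    strength -- editor slider: 0.0 = ignore the map, 1.0 = full effect
    """
    out = []
    for value, gain in zip(linear, gain_map):
        # Interpolate between "no adjustment" (gain 1.0) and the full gain.
        effective_gain = 1.0 + strength * (gain - 1.0)
        out.append(min(1.0, value * effective_gain))
    return out

pixels = [0.05, 0.40, 0.90]   # shadow, midtone, near-clipped highlight
gains  = [2.0, 1.1, 0.8]      # lift shadows, nudge mids, pull highlights

print(apply_local_tone_map(pixels, gains, 0.0))  # slider at zero: original data back
print(apply_local_tone_map(pixels, gains, 1.0))  # full rendering applied
```

At strength 0.0 the original sensor-derived values come back untouched, which is all "setting your sliders to zero" means: the captured image was never modified in the first place.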

With Smart HDR and local tone mapping, you not only have the RAW data to work with but the computational photography data as well. Before on the left, after on the right.
Screen Shot 2021-10-13 at 7.50.45 PM.png

Deep Fusion is reported to handle denoising in low/medium light; in normal photo mode it prevents noise. It's more computational photography making up for undersized camera hardware on a demanding shot, but in my opinion it does an amazing job.

Bayer RAW via Halide
Screen Shot 2021-10-13 at 11.17.40 PM.png

Apple ProRAW
Screen Shot 2021-10-13 at 11.17.27 PM.png

A bit tough to tell, but look at all the noise on the stems near the dirt in the standard Bayer RAW versus the ProRAW. I got those images from a video by Josh Stunell; he's a pretty good photographer, and I recommend watching it.


Anyway, Apple does a lot for us to make this easy, but it's still a RAW in a digital negative format that needs to be processed, graded, and exported as a photo.

My Instagram is full of amazing iPhone 13 Pro pics captured in ProRAW by photographers around the world. There were tons of good photographer reviews I skimmed through to find the above pics. We must all just have got a bad batch of iPhones - nothing to do with skill or talent or the lack thereof, no sir...
 
Maybe it's my old eyes and what I'm used to, but I much prefer the Halide results to the ProRAW in that video. I'm used to seeing a level of noise; it looks more natural to me. I've been working with digital photography for over twenty years now as a graphic designer. In my job there have been hundreds of times I've had to retouch photographs, and one of the key steps I take is matching noise levels, so it's something I have to be aware of. The smaller leaf below the others looks so unnatural to me; the lack of texture/noise just looks flat and cartoony, to my eye anyway. My eyes could be screwed, but that's what I'm seeing!

Screenshot-2021-10-14.jpg


The ghosted text here stood out to me.

I'm not an Apple basher by any means - I'm a huge fanboy who learnt Photoshop on a Classic II and have been on the ride ever since! I just prefer the more conventional-looking digital photos taken on my X.
 
Maybe it's my old eyes and what I'm used to, but I much prefer the Halide results to the ProRAW in that video. I'm used to seeing a level of noise; it looks more natural to me. I've been working with digital photography for over twenty years now as a graphic designer. In my job there have been hundreds of times I've had to retouch photographs, and one of the key steps I take is matching noise levels, so it's something I have to be aware of. The smaller leaf below the others looks so unnatural to me; the lack of texture/noise just looks flat and cartoony, to my eye anyway. My eyes could be screwed, but that's what I'm seeing!

Screenshot-2021-10-14.jpg

The ghosted text here stood out to me.

I'm not an Apple basher by any means - I'm a huge fanboy who learnt Photoshop on a Classic II and have been on the ride ever since! I just prefer the more conventional-looking digital photos taken on my X.

Don't know what you are talking about, your eyes are great! I glossed over that.

Personally, to me that looks like the edge of the iPhone 12 Pro wide-angle lens getting blurry, with a sort of coma aberration effect. The camera isn't moving, so it's not ghosting, and it's in both images. Look at the 'e' on the color checker in the Halide RAW: it's less noticeable, but it has a halo over it. I think it's just more pronounced in the ProRAW because it has less noise, not because it was over-sharpened.

Same phone, same sensor, same photo suite....they should be able to be made virtually indistinguishable from each other.

I get what you are saying about the noise. While I don't think the Halide is too noisy, I also don't think the ProRAW has been overly denoised. Subjective thing there. You can decrease the denoising in the ProRAW anyway if that's an issue.

But just for reference, my post wasn't stock camera ProRAW vs Halide Bayer RAW. Halide supports ProRAW capture as an option now anyway. It was just to show it's possible to use ProRAW just as well as you can use RAW if you try to.
 
Don't know what you are talking about, your eyes are great! I glossed over that.

Personally, to me that looks like the edge of the iPhone 12 Pro wide-angle lens getting blurry, with a sort of coma aberration effect. The camera isn't moving, so it's not ghosting, and it's in both images. Look at the 'e' on the color checker in the Halide RAW: it's less noticeable, but it has a halo over it. I think it's just more pronounced in the ProRAW because it has less noise, not because it was over-sharpened.

Same phone, same sensor, same photo suite....they should be able to be made virtually indistinguishable from each other.

I get what you are saying about the noise. While I don't think the Halide is too noisy, I also don't think the ProRAW has been overly denoised. Subjective thing there. You can decrease the denoising in the ProRAW anyway if that's an issue.

But just for reference, my post wasn't stock camera ProRAW vs Halide Bayer RAW. Halide supports ProRAW capture as an option now anyway. It was just to show it's possible to use ProRAW just as well as you can use RAW if you try to.
Makes me wish I'd gone for the pro, just to have that level of flexibility. I've had my refund from Apple for the 13 now, so I'm holding onto the cash until after Monday's event! It looks like the general consensus is there won't be a larger new iMac, but you never know…
 
Well, imagine my surprise when I "upgraded" from the 7 Plus to the 13 Pro Max and discovered all of this over-sharpened HDR ********!! Pictures with low contrast look pretty good, but that's about it. I thought this would save me from needing to use my mirrorless camera all the time but, alas, I find these photos to be largely unusable. What a shock and a shame!
 
I do want to add that in good lighting conditions, the new phone is very capable of nice photos. But without the ability to control the application of Apple's processing, there is a lot of potential being wasted at the moment. Hopefully we get an update soon.

Here’s a few unedited shots from my morning walk I consider to be decent (usable):

C9A08DE6-51A4-4933-AE3A-113E98E1CBA5.jpeg

221BA48E-FDA9-436D-B4EF-6224ABEF61C9.jpeg

20634CAB-F952-4DA9-8552-4CDBA007FCDF.jpeg


And here’s where you see the photo being destroyed by processing
60855E3D-0ED8-4D71-B45A-6E862630FF13.jpeg
 
I hope that a software fix will come. I also noticed that the phone sometimes crops from the main camera instead of using the telephoto, even during the day, until I restart the app. Pretty annoying.
 
I hope that a software fix will come. I also noticed that the phone sometimes crops from the main camera instead of using the telephoto, even during the day, until I restart the app. Pretty annoying.
That's when you're too close for the 77mm lens to focus; it has quite a long minimum focusing distance.
 
This is the awful sunrise with the 13 Pro Max!! UGGG - the sky was pink and orange, NOT blown-out red, and it's so over-the-top hot. My 7 Plus NEVER did this to the sky. It is awful, and I cannot believe I waited 5 years to upgrade only to have a worse camera :-(
 

Attachments

  • IMG_8604.jpg
  • IMG_8610.jpg
  • IMG_8601.jpg
Has anyone had any luck talking to Apple about the issue? Wondering if they are actually aware of the problem yet and working to fix it... I talked to someone from Apple on the phone a couple weeks ago and they weren't aware of the issue and couldn't give me any solution. :( ugh.
 
Another issue I've noticed is that when you are scrolling through your camera roll, if you tap a photo, it loads a blurry image first, followed by the full-res image. I know it would do this sometimes with iCloud Photos enabled, but I don't use iCloud Photos. The photos should look good instantly; watching them load makes me feel like I'm using a cheap device.
 
Submitted two feedback forms, one for the inconsistent results of HDR (ranging from blowing out images to not working at all) and the lack of option to turn it off, and another for the camera roll photos loading a blurry version first.

Another issue I have, although this has been persistent through the last several iPhones, is that by default the photos are often over-saturated. I find myself often having to drop the saturation. I know they do this because the average user is more impressed by bright, vivid photos, but if you have some photography experience, it stands out. The "Photographic Styles" option they added to the 13s, even in "Standard", looks too saturated; there should be a more neutral option.
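For what it's worth, dropping the saturation in post is mechanical: it's a uniform scale on the S channel in HSV space. A small illustrative Python sketch using only the standard library (the 0.7 factor is an arbitrary example, not a recommended value):

```python
import colorsys

def desaturate(rgb, factor):
    """Scale a colour's HSV saturation by `factor` (1.0 = unchanged)."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return tuple(round(c, 3) for c in colorsys.hsv_to_rgb(h, s * factor, v))

# An over-vivid red, tamed the way an editor's saturation slider would.
vivid_red = (0.95, 0.25, 0.20)
print(desaturate(vivid_red, 0.7))
```

The hue and brightness stay put; only the distance from grey shrinks, which is why a global saturation drop fixes the "too vivid" default without otherwise re-grading the shot.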

For them hailing this as their "biggest camera upgrade ever", the camera results have been underwhelming to say the least.
 
For me it is hit and miss with this "new Smart HDR". Problems arise with backlit photos. The effect is clearly detectable when you shoot Live Photos. Not that Live Photos are the problem, but when you scroll through a set of Live Photos, a small part of the Live Photo plays and then the still is shown - and that still is rendered in the horrible HDR version.

But for me it's mostly with backlit photos. I have yet to see how this Smart HDR and all the AI do on silhouette photos, where you basically don't want any HDR at all and it's all about getting your subject backlit.

(I hope Apple will bring the Smart HDR slider back for the iPhone 13 Pro series, and stop being so confident in their "it just works" credo.)
 
I think there are two better ways to contact Apple about this issue.
First, always file a Feedback Assistant issue.
It requires authentication with a valid Apple ID (not necessarily the one you are logged into Apple services with on your phone), and it's free.
The trick is that the app is hidden. To open it, you have to either be on a beta OR type this into Safari:

applefeedback://

Second, use the Apple Support app to chat while you are in the free support window after purchase. Insist that the representative forward the issue to the relevant team, and give them the identification number assigned to the Feedback Assistant issue you opened. During the beta, I noticed it said "Recent similar reports: more than 10" when I complained about the design of the new Safari address bar, and they progressively brought it into almost usable shape over several beta iterations.

I've also seen about 6-7 ideas that I've submitted over the years eventually become new iOS features (though no credit, or even a simple reply, was ever given).

I've got ZERO replies to complaints and ideas sent via Apple's public no-authentication feedback website.

So use Feedback Assistant, and flood Apple Support while you're entitled to free support.

I've read people complaining about the difference in photos between previous models and the 13, but imagine: I've just jumped from the SE 2016 to the 13 mini, and except for sensor stabilization and low light (which is, by the way, achieved with longer exposure plus sensor shift), I still feel that I've just spent a lot of money on a very small difference!
 