
TripleP
macrumors member, Original poster
Mar 27, 2012
I've been reading the threads about the various video output settings and frankly it makes my head hurt. My question is this: before I start watching content on my Apple TV, is there a simple way to know which video setting is best for that specific content? Even if I'm only looking at 4K titles, how do I know whether I should play HDR or SDR, and which refresh rate to choose?

It's maddening that there's a "best available" setting for audio quality but not for video. All I want is to watch all of my content in the best possible format. Is there an easy way to do that without turning each viewing session into a research project or test-driving every possible setting?
 
Welcome to the ATV 4K! It doesn't auto-switch, so you have to do it yourself. For 1080p content, set it to SDR. Most 4K content will have HDR, and there are currently two types; which one you use depends on the content and on what your TV supports. Does your TV do Dolby Vision? If so, use that. HDR but no Dolby Vision? Set it to HDR.
The refresh rate you use depends on what your TV supports in a particular mode. Use 24p if you can get away with it; otherwise use whatever works for you.
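That rule of thumb can be written down as a tiny decision helper. This is just an illustrative sketch: the function name, its inputs, and the returned labels are all made up for this post, and the ATV offers no such API.

```python
def pick_output_mode(content_format, tv_dolby_vision, tv_hdr10):
    """Rule-of-thumb mode picker per the advice above (hypothetical helper).

    content_format: "sdr", "hdr10", or "dolby_vision"
    """
    if content_format == "sdr":                   # 1080p / SDR sources
        return "SDR"
    if content_format == "dolby_vision" and tv_dolby_vision:
        return "Dolby Vision"                     # best case: DV title on a DV set
    if tv_hdr10:
        return "HDR10"                            # HDR title, or DV title on an HDR-only set
    return "SDR"                                  # HDR title on an SDR-only set
```

The point of writing it out is how short it is: this is the whole decision the box refuses to make for you.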
 
In addition to the above, try some experiments to see whether the ATV or your TV does a better job of upscaling the content.

ATV might upscale SD content better than your TV. But your TV may do a better job upscaling HD content.

Also, there is no 1080p 24Hz output mode. Most movies are 24 fps, so your only option is to watch HD movies at the 4K 24Hz setting, which may mean the upscaling isn't as good, but the motion handling might be better.

Avoid using DV or HDR modes on anything but 4K content, and ONLY if the source is encoded for it. If your TV supports both DV and HDR10, make sure you're set for the proper encoding: don't use DV if the movie is only encoded for HDR10, and don't use HDR10 if the movie is listed as DV. There's some inconsistency in how DV is handled on streaming media, and a DV title may not include an HDR10 fallback if your TV doesn't support DV, in which case you may be better off watching it in SDR.

Good luck. Apple really made a mess of this!
 
This is my point exactly. It shouldn't take a research project to figure out the best picture setting every time I watch something. How Apple thinks this is better than a slight delay while auto-switching is beyond me.
 

My suggestion is to start with the basics. Just pick the best it offers and watch your normal content to see if you notice anything distracting or that you don't like.

For me, my TV compensates very well for most of the issues people complain about here. Or my eyes/brain can't perceive what many can. Regardless, it's fine and I don't need to keep switching between SDR, HDR, refresh rates and such.

If you do see something you don't like, check whether there are settings in your TV to compensate. Take the 24p judder issue: with motion interpolation, my TV handles 24 fps content in a 60Hz signal very well without introducing SOE (the soap opera effect). However, a purist and/or someone hypersensitive to judder may not find that good enough.

The brightness issue, on the other hand, is something I feel needs to be solved in a firmware update. Since Apple forces a faux HDR signal, my TV won't switch between its SDR and HDR profiles. That would be fine, except either SDR is overly bright or HDR isn't bright enough (depending on calibration).
 
This is how I'm using it:

SDR content @ 4K SDR
HDR / DV content @ 4K 24Hz

I'm on an LG C7. Any suggestions?
 
As my TV supports 60Hz OK (4:2:2), I have left it at that and just toggle SDR/HDR as required. The results are good.

I think the box is smarter than people give it credit for.
 
I actually think the ATV 4K does a pretty decent job of remapping SDR content to HDR.
The simplest way to evaluate it: use the built-in calibration screen (Settings > Audio & Video > Calibration > Color Bars, if memory serves) to compare the brightness and contrast in both modes.
They look pretty similar on my screen in SDR vs. HDR10 (Bravia ZD9).
 

I agree. I really don't see Apple offering auto-switching between HDR and SDR; they were against that, and it doesn't fit their ethos. What I see happening is Apple tweaking and refining the SDR-to-HDR conversion. It reminds me of the iPhone 4 Retina display, where some apps weren't Retina-ready at first, but look now :)
This is how I'm using it:

SDR content @ 4K SDR
HDR / DV content @ 4K 24Hz

I'm on an LG C7. Any suggestions?
Sounds good if you only watch film (at 24p). Were you not happy with the results at the recommended 60p?
 

In Vincent's HDTVTest comparison of the UHD disc and the ATV 4K, he says the 2017 LG OLEDs can properly handle 60Hz input (from a 24Hz source file), and it looks just as smooth as using 24Hz. That's not true for all TVs, but apparently these do a good job, and I trust Vincent's impressions. I had also been switching to 24Hz but honestly didn't notice a difference; both looked good. So based on this review I am now leaving it at 60Hz, although I don't always like what the SDR-to-Dolby Vision conversion does to skin tones, and there is some black crush introduced, so I still switch between SDR and HDR modes as appropriate for the source.
 
I actually think the ATV 4K does a pretty decent job of remapping SDR content to HDR.
The simplest way to evaluate it: use the built-in calibration screen (Settings > Audio & Video > Calibration > Color Bars, if memory serves) to compare the brightness and contrast in both modes.
They look pretty similar on my screen in SDR vs. HDR10 (Bravia ZD9).

My Bravia 900E is noticeably dimmer when the ATV 4K is in HDR mode, and that's the problem: every TV is different. Yes, they can tweak the output, but so far I'm not seeing anything close to the correct look for SDR content remapped to HDR. Since most streaming content is SDR, there is no reason for the ATV to crank everything up to HDR all the time. But I agree, I don't see Apple ever implementing native output.
 
Without proper calibration it will be hit and miss anyway. It could also be that your SDR mode is set overly bright. BTW, out of the box, Sony Bravias are too bright and too blue. To the colorimeter, that is; the owner's eye may be perfectly happy with them.
BTW I've set my SDR and HDR modes for all inputs to Cinema Pro mode.
 

Nope. I'm using Cinema Home and similar custom settings arrived at after much hit-or-miss experimenting with other users on AVS. I like warm displays, which is part of the problem. I've had the TV since March, so I've had plenty of time to figure out how it works. The Apple TV 4K is the culprit here: it should absolutely NOT be forcing signals that aren't encoded for HDR into HDR, bypassing my TV's own native abilities. The 900E is inherently darker when displaying HDR images compared with higher-end TVs that have higher peak brightness. In general I have no problem watching content actually encoded for HDR, which tends to be movies viewed under specific conditions (a darkened room, etc.). But since the ATV 4K insists on driving the TV in full HDR mode all the time, I get a darker picture in less-than-optimal conditions, i.e. a bright room during the day.

There's really no excuse for this. At a minimum, Apple should have buried a native-output option in an advanced submenu if they didn't want the average customer to mess with it.
 
From reading reviews that say LCD TVs run their backlight at maximum when displaying HDR, I'm getting the impression that HDR/DV is really designed for OLED; future thinking, I guess. For people who aren't happy with the ATV forcing HDR on everything: with so few films in DV/HDR, surely it's just a case of leaving the ATV in SDR and revisiting the HDR/DV option in six months, when Apple will hopefully have tweaked the SDR-to-HDR conversion? And if HDR requires an LCD's backlight to be at maximum, that seems like a really bad thing for the longevity of the television, and it makes me wonder whether HDR was merely a marketing gimmick to sell LCD TVs rather than something designed for long-term use on them. Just my thoughts.
PS: Not many brand-new films are in HDR/DV here in the UK. I've pre-ordered three and there's no sign of them being in 4K (although two of them are in the USA).
 
I did mention it in another thread somewhere: HDR10 and DV are both based on the PQ transfer function, devised by Dolby Labs to best allocate the available bits (10 in today's consumer devices, 12 in the Dolby spec) so that contrast gradation steps remain below the threshold of human vision (hence the name Perceptual Quantiser).
An important aspect of the PQ transfer function ("gamma" in older terms) is its absolute nature: a given digital pixel value is meant to produce a predefined luminance on screen. There are no two ways to interpret it!
For example, a 10-bit pixel value of 767 must be output at roughly 1000 nits on any PQ-compliant display, and SDR reference white (100 nits) falls at a code value of about 520.

This has (or should have) the following effects:
- all calibrated HDR displays output the same light intensity for a given pixel value;
- this obviously does not account for the ambient light level in any way;
- which in turn requires the HDR PQ standard to prescribe an ideal viewing environment: a 5-nit ambient light level.
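That absolute code-to-luminance mapping can be checked numerically with the published ST 2084 constants. A sketch (full-range 10-bit codes assumed; real video signals typically use the narrower 64-940 range):

```python
def pq_eotf(code, bit_depth=10):
    """ST 2084 (PQ) EOTF: full-range code value -> absolute luminance in nits."""
    m1, m2 = 2610 / 16384, 2523 / 32          # published ST 2084 exponents
    c1, c2, c3 = 3424 / 4096, 2413 / 128, 2392 / 128
    e = code / (2 ** bit_depth - 1)           # normalize code to 0..1
    ep = e ** (1 / m2)
    y = (max(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1 / m1)
    return 10000.0 * y                        # PQ tops out at 10,000 nits
```

Running it, code 520 comes out near 100 nits (SDR reference white), code 767 near 1000 nits, and code 1023 at exactly 10,000 nits.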

What I actually want to say is this: SDR material can also be properly displayed in HDR mode, without bad artefacts. One just needs to keep the upper brightness limited to 100 nits (the SDR mastering standard) and provide a colour transform into BT.2020 space and a transfer-function transform into ST 2084 (PQ), because PQ and BT.2020 are mandated by the HDR standards and the display switches to those characteristics when set into HDR mode.
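A minimal sketch of that idea: linearize the SDR signal, pin reference white at 100 nits, and re-encode with the PQ curve. This assumes a simple 2.4 power gamma for the SDR source and ignores the BT.709-to-BT.2020 primaries conversion, so it is a per-channel illustration, not a full pipeline:

```python
def sdr_code_to_pq(code_8bit, sdr_peak_nits=100.0):
    """Map an 8-bit SDR code into a 10-bit PQ code, capped at sdr_peak_nits."""
    linear = (code_8bit / 255.0) ** 2.4       # simple display-gamma linearization
    nits = linear * sdr_peak_nits             # SDR reference white -> 100 nits
    # ST 2084 inverse EOTF (nits -> PQ signal)
    m1, m2 = 2610 / 16384, 2523 / 32
    c1, c2, c3 = 3424 / 4096, 2413 / 128, 2392 / 128
    ym = (nits / 10000.0) ** m1
    pq = ((c1 + c2 * ym) / (1 + c3 * ym)) ** m2
    return round(pq * 1023)                   # full-range 10-bit code
```

Done this way, SDR white (8-bit code 255) lands around PQ code 520 and never touches the brighter half of the HDR range, which is exactly the "no bad artefacts" behaviour described above.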

PS: As a side effect, because most HDR content is currently mastered on at best a 4000-nit display (brighter ones just don't exist yet), the 10-bit code values from roughly 920 (4000 nits) up to 1023 (10,000 nits) are simply unused in any current HDR source, i.e. those codes are actually wasted, despite the opposite intent!
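The "wasted codes" claim can be verified with the inverse of the same transfer function (again a sketch, full-range 10-bit assumed):

```python
def pq_inverse_eotf(nits, bit_depth=10):
    """ST 2084 inverse EOTF: luminance in nits -> full-range code value."""
    m1, m2 = 2610 / 16384, 2523 / 32          # published ST 2084 exponents
    c1, c2, c3 = 3424 / 4096, 2413 / 128, 2392 / 128
    ym = (nits / 10000.0) ** m1
    e = ((c1 + c2 * ym) / (1 + c3 * ym)) ** m2
    return round(e * (2 ** bit_depth - 1))

# A 4000-nit mastering peak lands around code 923, so roughly a hundred
# of the 1024 codes (everything up to 1023 = 10,000 nits) go unused.
```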
 
My suggestion is to start with the basics. Just pick the best it offers and watch your normal content to see if you notice anything distracting or that you don't like.

For me, my TV compensates very well for most of the issues people complain about here. Or my eyes/brain can't perceive what many can. Regardless, it's fine and I don't need to keep switching between SDR, HDR, refresh rates and such.

If you do see something you don't like, check whether there are settings in your TV to compensate. Take the 24p judder issue: with motion interpolation, my TV handles 24 fps content in a 60Hz signal very well without introducing SOE (the soap opera effect). However, a purist and/or someone hypersensitive to judder may not find that good enough.

The brightness issue, on the other hand, is something I feel needs to be solved in a firmware update. Since Apple forces a faux HDR signal, my TV won't switch between its SDR and HDR profiles. That would be fine, except either SDR is overly bright or HDR isn't bright enough (depending on calibration).

I agree for the most part. I've found that once I got my TV settings and HDMI settings just right, there was no need to switch between SDR and HDR; I could just leave it on the best settings the Apple TV offers and everything looked exceptional.
 
For streaming content like Netflix and iTunes, is there a way to tell what format the source material is in? In most cases the info just shows HD or 4K (sometimes HDR) but no refresh rate. If HDR is not indicated, I'm assuming the material is SDR and would be best viewed with the SDR setting rather than HDR. Is that right?
 
I agree for the most part. I've found that once I got my TV settings and HDMI settings just right, there was no need to switch between SDR and HDR; I could just leave it on the best settings the Apple TV offers and everything looked exceptional.

That is excellent news, and I get the impression that the SDR-to-HDR conversion will genuinely improve as Apple works on it. Certainly that's what the folks at Apple told The Verge's reviewer; their words were, "Our intention is simply to make SDR content look like SDR content in HDR mode."
(It's obvious that the over-critical Verge review has done a lot of damage to Apple and the ATV 4K, as sadly every other reviewer is taking those words as gospel.)
For streaming content like Netflix and iTunes, is there a way to tell what format the source material is in? In most cases the info just shows HD or 4K (sometimes HDR) but no refresh rate. If HDR is not indicated, I'm assuming the material is SDR and would be best viewed with the SDR setting rather than HDR. Is that right?

You know what, follow your eyes on all of these settings. Just set it to whatever looks right to you and leave it: the settings that 1. make you happy and 2. you can forget about. Apple's idea is for the TV to match the ATV, e.g. 4K HDR 60; see how you get on with that and play with the TV settings. (Don't overstrain the TV's backlight, though, if you watch for many hours at a time.) It reminds me of bass and treble controls, where audio purists just had to leave them flat; that really does not work for me (it sounds dead, lifeless, boring) or for dance/RnB music :)
 
I agree for the most part. I've found that once I got my TV settings and HDMI settings just right, there was no need to switch between SDR and HDR; I could just leave it on the best settings the Apple TV offers and everything looked exceptional.

This really depends on your TV and what it's capable of. I'm glad you found settings that equalize it for you. However, while my Sony 900E has very good backlighting, HDR tends to be a bit dimmer, which is fine when watching a movie in HDR in a dim room but terrible for daytime viewing of anything. Whether Apple intends to improve their SDR-over-HDR implementation or not (and we all know what the road to hell is paved with), they can't possibly account for every TV on the market, and many TVs just don't have the nits to run HDR full-throttle in every circumstance. That's why, at a minimum, Apple needs to offer native output as an option.

For streaming content like Netflix and iTunes, is there a way to tell what format the source material is in? In most cases the info just shows HD or 4K (sometimes HDR) but no refresh rate. If HDR is not indicated, I'm assuming the material is SDR and would be best viewed with the SDR setting rather than HDR. Is that right?

On iTunes, no; there's precious little information. About the best you can do is swipe up and see the format you're receiving on that device. Netflix was giving a bit more information in its Android app, but I'm not sure about the ATV app. If it doesn't say HDR or DV, then yes, it's likely SDR.

If it's a movie, then 24Hz is likely the best setting, as 24 fps is the standard for film. TV is likely 30Hz, though some shows, especially the Netflix ones shot in Univisium, may also use a film frame rate. Whether you need to be concerned about the frame rate depends a lot on your TV. A 24 fps film will look most natural at its native frame rate; at 30 or 60, some pulldown will be introduced that can make the film look off, particularly in panning shots. That's where your TV comes in: how effective it is at smoothing out the picture without producing a soap opera effect will determine whether you can leave it at 60Hz for everything or prefer switching to 24Hz.

So far I have not been impressed by the ATV 4K's upscaling or motion handling compared to my TV's, so I've been switching to the proper native format (in some cases guessing) and letting my TV do the work it was designed to do. In most cases I personally think it looks better; in others, it's hard to know whether I've guessed the native frame rate incorrectly or the ATV is doing a better job.
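The pulldown mentioned above is easy to visualize: on a fixed-rate display, each 24 fps frame has to be held for an unequal number of 60 Hz refreshes. A small illustrative helper (hypothetical, just to show the cadence):

```python
def pulldown_cadence(source_fps, display_hz, n_frames):
    """How many refresh ticks each source frame is held for on a fixed-rate display."""
    # Cumulative ideal tick positions, snapped to whole refreshes.
    ticks = [round(i * display_hz / source_fps) for i in range(n_frames + 1)]
    # Per-frame hold durations are the differences between snap points.
    return [b - a for a, b in zip(ticks, ticks[1:])]

# 24 fps on a 60 Hz panel alternates 2- and 3-tick holds (3:2 pulldown),
# which is what reads as judder in slow pans; at a native 24 Hz mode
# every frame is held equally, so the motion stays even.
```

30 fps TV content divides evenly into 60 Hz (every frame held for exactly two refreshes), which is why the frame-rate question mostly matters for film.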

Unfortunately Apple has not given us the choice, and seems to be intentionally preventing us from getting native formatting information to make manual adjustments. But in the end, it's going to depend on what looks good to you with the equipment you have.
 