Hi all,
I apologize for the technical nature of this thread, but I've been trying to figure this out for two days, and I still can't find a definitive answer despite some excellent threads I've read, some of which seem to contradict each other.
A little background on my setup, which hopefully others will find useful. I'm running an LG 65E6P with a Marantz SR7011 receiver. Originally I had the Apple TV going through the receiver, but due to a problem with Dolby Vision at 24Hz and poor audio sync, I now have the Apple TV connected directly to the TV and am using ARC (which seems to be working properly).
I'm aware that it's always best for the source frame rate to divide evenly into the TV's refresh rate; i.e., for a Blu-ray player that outputs 24fps, a 24Hz refresh is perfect. But I'm not sure it's that simple with a streaming box like the Apple TV 4K, because as far as I can tell there's no way to know exactly what the source format is. I am aware you have to manually set the output if you want to change the refresh rate or switch between HDR and Dolby Vision. What I don't know is whether, when I set the refresh rate to 24Hz, the Apple TV is playing a 24fps source "natively", or whether it's taking a source that should be at 24, upconverting it to Apple's preferred format of 60, then downconverting it AGAIN back to 24. That would seem weird, but I could see it being possible, since Apple seems to really want 60Hz as its ideal setting... If the latter were true, the refresh rate wouldn't matter much and the fastest available should always be used.
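To make that "divides evenly" point concrete, here's a quick cadence sketch in Python. This is purely illustrative arithmetic on my part, not a claim about what the Apple TV actually does internally, and the function is just something I made up for the example:

```python
# Illustrative cadence math only; nothing here is Apple's actual pipeline.
def cadence(source_fps: int, refresh_hz: int, frames: int = 12) -> list[int]:
    """Refresh cycles each source frame is held on screen (floor timing)."""
    ticks = [(i * refresh_hz) // source_fps for i in range(frames + 1)]
    return [ticks[i + 1] - ticks[i] for i in range(frames)]

print(cadence(24, 24))  # [1, 1, 1, ...]     -> even: every frame held once
print(cadence(24, 60))  # [2, 3, 2, 3, ...]  -> uneven: 3:2 pulldown judder
```

That uneven 2/3 hold pattern in the second case is exactly the judder people complain about with 24fps film on a 60Hz display.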
Some specific examples to illustrate what I'm trying to determine (the same cadence math, run against all of the Hz choices below, follows the list):
1.) I play a movie I rented or bought on iTunes in 4K w/ Dolby Vision. My choices for Hz are 30, 25, and 24. Now, I know a Blu-ray would be at 24Hz, so setting it to 24 should result in the least judder. BUT, I can't find confirmation of what frame rate the video SOURCE file actually is. I've seen some things that made me think everything iTunes sends is at 60 and the Apple TV does the down-conversion, in which case, would setting it to 24 even help? Or am I actually getting a 24fps signal that matches the 24Hz refresh rate optimally?
2.) The same question, but with Netflix. I found a neat trick to see the frame rate being sent when watching Netflix in a browser. (Try CTRL-ALT-SHIFT-D while watching something.) That seems to indicate ~24fps for the content I tried. Does this carry over to the Apple TV app? If so, it would again mean 24Hz is the best setting, but again, I don't know if behind the scenes the Apple TV receives all sources at 60, or converts to that, before outputting at whatever refresh rate is chosen (I wouldn't think so, though).
3.) Basically the same as the other two, but for HDR, where 60Hz is available. I know some TVs double 60 to 120 so 24 fits in evenly, but does it still make sense to watch at 24 where available, or is that again moot because of what the Apple TV is doing behind the scenes with the source?
4.) For SDR, since the only choices are 50 or 60Hz, is it best to just stick with 60? Is there any benefit to choosing 50 for 24fps sources like movies, since it's closer to 48?
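And tying the four cases together, here's the same arithmetic run against every Hz choice mentioned above. Again, just a sketch; I'm glossing over the fractional 23.976/59.94 rates that real content often uses:

```python
# Which refresh-rate choices are an even multiple of 24 fps film?
# Illustrative only; real streams are often 23.976 fps, ignored here.
SOURCE_FPS = 24
for hz in (24, 25, 30, 50, 60, 120):  # 120 = a TV internally doubling 60
    if hz % SOURCE_FPS == 0:
        print(f"{hz:>3} Hz: even multiple -> each frame held {hz // SOURCE_FPS}x, no judder")
    else:
        print(f"{hz:>3} Hz: not a multiple -> uneven pulldown cadence")
```

If that's the right way to think about it, then 50Hz being "closer to 48" wouldn't actually help a 24fps source, since 50 still isn't an even multiple of 24; 50Hz only fits evenly for 25fps content. But that's exactly the kind of assumption I'd like someone to confirm.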
Thanks to anyone who can shed some light on this! It's driving me nuts!