
Mabus51

Suspended
Aug 16, 2007
1,366
847
Hey ATV 4K fans. Can I ask a probably stupid question? I have the newest ATV (yay!) and a 2016 LG B6 OLED 55-inch TV (also yay!). Among the various options available to me in the ATV video and audio settings, I haven't yet seen anything that offers 4K as an output. The top option for me is 1080p, 60Hz. I know there's a lot of discussion about the 2016 LGs not being able to do certain HDR options at 60Hz, but I'm puzzled why I have NO 4K option on the ATV. I watched a movie last night -- Wonder Woman -- which, when I swipe down, shows that it's in Dolby Vision, so I assume I'm getting 4K from the ATV. But am I? (The movie looked great, but maybe it's 1080p upscaled by the TV to 4K?) Can anyone tell me why I don't see any 4K options in the on-screen menu on the ATV? Any advice really welcome! (Image attached of my on-screen menu.)
Did your ATV come in a white box or a black box? The screenshot you provided looks like it's from the non-4K ATV. The 4K one allows you to run a test on your cables to make sure they are suitable for HDR.
 
  • Like
Reactions: Poontaco

geesus

macrumors 6502
Jul 3, 2015
372
129
I consider a black screen for 1-2 seconds a terrible user experience, and that is the best user experience you can hope for when changing frequency in the land of TV. Other TVs put up "Searching for Signal" boxes on the screen. Ewww. Worse: start a movie (1-2 seconds), decide it's not the movie I wanted, go back (1-2 seconds), start a new film (1-2 seconds). It's an absolutely maddening user experience.

I can assure you, having to change resolution settings every time you go from SDR to HDR, from 24Hz to 60Hz, or from HDR to Dolby Vision is a far more maddening user experience than a 1-2 second pause.
 

palmerc

macrumors 6502
Feb 26, 2008
350
225
Try to get your hands on the very old Samsung H6500 Blu-ray player or the new Samsung UHD M9500 and tell me if you see any problems when it comes to frame rate switching. I haven't seen any on my devices. Judder doesn't need to be a part of the common user experience these days.

As I said, try 50Hz content with your recommended 60Hz setting. Is it really a better experience to have jerky video than a black screen for a second when you start the video? Well, two people, two opinions, I guess. We should leave it at that.

I gave away all my Blu-Ray crap about a year ago and have been completely Apple TV since. My content comes from Netflix, iTunes, HBO Nordic, YouTube, NRK (national broadcaster), the cable company's app (Get), and a few other apps. I have completely sworn off physical media and anything involving commercials. Oh, and the TV remote is in a drawer, with everything handled by Apple's included remote, which I like.

"Judder doesn't need to be a part of the common user experience these days."

I understand your point. In matched devices, I'd expect a higher quality experience than just 'two devices that were on sale when I bought them.' The experience of judder is largely one of how you're watching a film - cognitively or emotionally. If judder occurs in the first 5 minutes or in a crappy film, I'm likely to notice. If I'm enthralled by the content, I won't. Channel switching lag, commercials, and frequency switching all occur on a cognitive level and I notice.
 
  • Like
Reactions: Poontaco

kiranmk2

macrumors 68000
Oct 4, 2008
1,529
1,979
So, to summarise:
  • Dolby Atmos is coming soon
  • The box is rendered almost useless by outputting a constant signal and trying to adapt all content to that rather than switching outputs. This leads to improper HDR (trying to HDR-ise SDR content, or converting one HDR format to another) and juddering (e.g. 24 Hz content displayed at 60 Hz - see the quick cadence sketch below).
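For what it's worth, the judder complaint is just arithmetic: 60 is not an integer multiple of 24, so each film frame has to be held for an uneven number of refreshes. A rough Python sketch of the cadences (purely illustrative; not how tvOS or any TV actually schedules frames):

    import math

    # How many display refreshes each source frame is held for at a fixed output rate.
    def cadence(source_fps, display_hz, frames=8):
        step = display_hz / source_fps                      # refreshes per source frame
        ticks = [math.floor((i + 1) * step) for i in range(frames)]
        return [ticks[0]] + [b - a for a, b in zip(ticks, ticks[1:])]

    print(cadence(24, 60))    # [2, 3, 2, 3, 2, 3, 2, 3] -> uneven hold times = judder
    print(cadence(24, 120))   # [5, 5, 5, 5, 5, 5, 5, 5] -> even cadence, no judder
    print(cadence(25, 60))    # [2, 2, 3, 2, 3, 2, 2, 3] -> 50 Hz material at 60 Hz is even messier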
 
  • Like
Reactions: xsmett

palmerc

macrumors 6502
Feb 26, 2008
350
225
I can assure you, having to change resolution settings every time you go from SDR to HDR, from 24Hz to 60Hz, or from HDR to Dolby Vision is a far more maddening user experience than a 1-2 second pause.

But most people won't ever switch the frequency after the first time. They won't even know that changing frequency might remove judder or even what judder is. They would, however, notice a long delay in frequency switching, or if it messed up. The safe choice is a fixed output.

And you should never switch off HDR; it is the Apple TV's job to just handle the translation between the two.
 
Last edited:
  • Like
Reactions: Donfor39

geesus

macrumors 6502
Jul 3, 2015
372
129
But most people won't ever switch the frequency after the first time. They won't even know that changing frequency might remove judder or even what judder is. And you should never switch off HDR; it is the Apple TV's job to just handle the translation between the two.

Yeah, you should absolutely switch off HDR when watching SDR material. If there was an auto-switching feature, which even the flaky Roku players manage to handle without issue, then all of this would be moot. There is no excuse for it, and I can't believe people are defending Apple... but then it's Apple, and some people believe they can never be wrong. And I say that as someone who owns a MacBook Pro, iPhone, Apple Watch, 2x ATVs and AirPods, so I am as encamped in the Apple ecosystem as anyone.

They have messed up here, be it deliberately or accidentally, and deserve to be called out on it.
 

palmerc

macrumors 6502
Feb 26, 2008
350
225

Overall a positive review, minus of course the usual whining about image handling choices. One recurring theme in all these reviews is complaining about the remote. The remote is excellent, and it is certainly much better than anything shipping with a TV.

Like Nilay Patel at The Verge, who didn't even bother to use it, these reviewers don't really put any time into the remote and assume they won't like it based on limited usage. The old remote is not fiddly, nor are there orientation problems after a few days of regular use. With the new white circle, orientation is more obvious, which removes the learning curve from the remote entirely. In their defence, you know these reviewers have complicated multiple-input/output setups that they're not prepared to ditch.
Yeah, you should absolutely switch off HDR when watching SDR material. If there was an auto-switching feature, which even the flaky Roku players manage to handle without issue, then all of this would be moot. There is no excuse for it, and I can't believe people are defending Apple... but then it's Apple, and some people believe they can never be wrong. And I say that as someone who owns a MacBook Pro, iPhone, Apple Watch, 2x ATVs and AirPods, so I am as encamped in the Apple ecosystem as anyone.

They have messed up here, be it deliberately or accidentally, and deserve to be called out on it.

It isn't about never being wrong. Clearly, there is a fundamental assumption of universality being made - "it works on my TV, so it will work on everyone's TV." That is just flatly false. Implementing 'auto-switching' would be anywhere from an ugly experience to outright broken, depending on the TV. Just start searching for '<device name> auto-switching problems' and you'll find forums where people are trying to work around the problems.

TVs suck (seriously, why can't you just display what the input tells you to and not try to process the image to make it better?!) and I think many people are too kind to the TV manufacturers. If there is a limiting factor to today's Apple TV experience, it is brought to you by TV manufacturers, not by Apple's desire to display a less-than-ideal picture. In any case, HDMI 2.1 will resolve the issues most people are whining about without needing to re-sync the TV screen to change frequency.
Yeah, you should absolutely switch off HDR when watching SDR material. If there was an auto-switching feature, which even the flaky Roku players manage to handle without issue, then all of this would be moot. There is no excuse for it, and I can't believe people are defending Apple... but then it's Apple, and some people believe they can never be wrong. And I say that as someone who owns a MacBook Pro, iPhone, Apple Watch, 2x ATVs and AirPods, so I am as encamped in the Apple ecosystem as anyone.

They have messed up here, be it deliberately or accidentally, and deserve to be called out on it.

SDR is always mapped to HDR either on the input side or the TV side. There is no reason to prefer relying on the TV to perform the up-conversion.
 
Last edited:

geesus

macrumors 6502
Jul 3, 2015
372
129
It isn't about never being wrong. Clearly, there is a fundamental assumption of universality being made - "it works on my TV, so it will work on everyone's TV." That is just flatly false. Implementing 'auto-switching' would be anywhere from an ugly experience to outright broken, depending on the TV. Just start searching for '<device name> auto-switching problems' and you'll find forums where people are trying to work around the problems.

TVs suck (seriously, why can't you just display what the input tells you to and not try to process the image to make it better?!) and I think many people are too kind to the TV manufacturers. If there is a limiting factor to today's Apple TV experience, it is brought to you by TV manufacturers, not by Apple's desire to display a less-than-ideal picture. In any case, HDMI 2.1 will resolve the issues most people are whining about without needing to re-sync the TV screen to change frequency.

SDR is always mapped to HDR either on the input side or the TV side. There is no reason to prefer relying on the TV to perform the up-conversion.

I'm not sure what TV you own, but I have a ten-year-old plasma in the house that manages to switch frame rates almost instantly when asked, and my current LG *is* instantaneous at automatically acting on being fed an HDR signal, then a Dolby Vision signal, and back again. So, again, there is no excuse for it.

Also, you seem to misunderstand the issue with forcing HDR on - do you own an ATV 4K? And a calibrated TV? Switch your Apple TV to HDR mode, then watch SDR material on it. It looks hideous.
 

palmerc

macrumors 6502
Feb 26, 2008
350
225
I'm not sure what TV you own, but I have a ten-year-old plasma in the house that manages to switch frame rates almost instantly when asked, and my current LG *is* instantaneous at automatically acting on being fed an HDR signal, then a Dolby Vision signal, and back again. So, again, there is no excuse for it.

Also, you seem to misunderstand the issue with forcing HDR on - do you own an ATV 4K? And a calibrated TV? Switch your Apple TV to HDR mode, then watch SDR material on it. It looks hideous.

It isn't about what TV I own. I own several. I also own two Apple TV 4Ks and the previous generation box. We're not troubleshooting my personal choices in the MacRumors forum. This is about the broader world of TVs and what can be expected in the wild. Comparing TVs would be a rabbit hole unto itself.

SDR doesn't look hideous. Out of curiosity... what are you using as input?
 

geesus

macrumors 6502
Jul 3, 2015
372
129
It isn't about what TV I own. I own several. I also own two Apple TV 4Ks and the previous generation box. We're not troubleshooting my personal choices in the MacRumors forum. This is about the broader world of TVs and what can be expected in the wild. Comparing TVs would be a rabbit hole unto itself.

SDR doesn't look hideous. Out of curiosity... what are you using as input?

The ATV is the input. Feeding a LG B6 OLED. You can try it in any app you choose, be it iTunes, Infuse, Netflix et al. Televisions are calibrated to a set standard - when you artificially convert to HDR it throws all these settings out of whack. HDR and SDR are calibrated independently for a reason.
 

palmerc

macrumors 6502
Feb 26, 2008
350
225
The ATV is the input. Feeding a LG B6 OLED. You can try it in any app you choose, be it iTunes, Infuse, Netflix et al. Televisions are calibrated to a set standard - when you artificially convert to HDR it throws all these settings out of whack. HDR and SDR are calibrated independently for a reason.

Netflix and iTunes look perfectly fine on my TV. TV problem? I deliberately didn't buy a 4K OLED. I think OLED has some nice characteristics, but I'm waiting another 1-2 years for the technology to mature and for the prices to come down. I also never liked plasma.

Infuse, I don't use, but I suspect this app would do the wrong thing.

The thing is... it wouldn't surprise me if Apple offered some form of auto-switching in a future update. I assume it would be an opt-in feature, given the inherent problems. This would make the calibrators happier. However, there is a solid reason why it didn't ship on day one: it just isn't something most people will notice, and it would introduce problems for people who are not buying the latest top-of-the-line, stupidly expensive TV.
 
Last edited:

xsmett

macrumors regular
Nov 1, 2015
193
195
Palmerc, if the SDR to HDR conversion looks really good on all your displays, I really would love to know what TV you own, because I would consider buying it. Thanks.

I have four TVs and two projectors, and it looks like crap on most content. And yes, I know how to calibrate a TV.
 
  • Like
Reactions: cyb3rdud3

palmerc

macrumors 6502
Feb 26, 2008
350
225
Palmerc, if the SDR to HDR conversion looks really good on all your displays, I really would love to know what TV you own, because I would consider buying it. Thanks.

I have four TVs and two projectors, and it looks like crap on most content. And yes, I know how to calibrate a TV.

I wouldn’t presume to suggest you don’t know how to calibrate.

It might help to identify a specific example on Netflix, for example, that we could all test.
 

vipergts2207

macrumors 601
Apr 7, 2009
4,204
9,296
Columbus, OH
SDR is always mapped to HDR either on the input-side or TV-side. There is no reason to prefer relying on the TV to handle or perform the up conversion.

It isn't about what TV I own. I own several. I also own two Apple TV 4Ks and the previous generation box. We're not troubleshooting my personal choices in the MacRumors forum. This is about the broader world of TVs and what can be expected in the wild. Comparing TVs would be a rabbit hole unto itself.

SDR doesn't look hideous. Out of curiosity... what are you using as input?

What are you talking about? Any on-the-fly "mapping" being done by converting SDR to HDR is going to result in the original SDR content being incorrect. There's no magic formula to convert SDR to HDR, and attempting to do so means the picture is no longer being seen as intended, i.e. sub-optimal colors and contrast. Just because it may subjectively look good to you doesn't mean it's correct, and anyone bothering to calibrate their TV isn't going to want the SDR content converted into fake HDR. I'm going to guess you have your TV set to torch mode. And the reason so many reviews complain about the image handling choices is because it was a stupid choice. You don't have to agree with that, but those who care about the quality of their video will.
 
  • Like
Reactions: cyb3rdud3

geesus

macrumors 6502
Jul 3, 2015
372
129
The annoying thing is we arguably have the best media player out there now, with HDR and Dolby Vision support as well as a working forced 24Hz mode. If Amazon ever puts its app on there, it really will be a winner, but this odd design choice really holds it back.
 
  • Like
Reactions: KBJ55 and cyb3rdud3

cyb3rdud3

macrumors 68040
Jun 22, 2014
3,259
2,018
UK
You are definitely not able to change every aspect of the display. The level of configurability is actually limited to the values that Apple cannot determine through HDMI.

Example - Chroma 4:2:0 vs 4:2:2. If you read their dialog, it says 4:2:0 was chosen for compatibility with most TVs, but 4:2:2 would provide better image quality. That is because they don't know if your TV can handle it unless you tell it.

Auto-switching today is guaranteed to do sub-optimal things.

I consider a black screen for 1-2 seconds a terrible user experience, and that is the best user experience you can hope for when changing frequency in the land of TV. Other TVs put up "Searching for Signal" boxes on the screen. Ewww. Worse: start a movie (1-2 seconds), decide it's not the movie I wanted, go back (1-2 seconds), start a new film (1-2 seconds). It's an absolutely maddening user experience.

Judder is and will be a part of the common user experience for a bit longer. Most people don't notice this phenomenon. It bugs me too.

The Apple TV is a really great box that does what people expect across all media types and apps, is available worldwide, and is reasonably priced. It does not solve all of the technical challenges facing TVs.
I really don't understand how you get to that conclusion.

I'm sorry, Palmerc, but you put out so many untruths with such confidence that it is rather dangerous, in my opinion. It is quite clear that you've never heard of Extended Display Identification Data (EDID), because if you had, you'd know that it is incredibly easy to determine those values. It is flag number 24 that will come back with the information that you highlighted above.
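For anyone curious what reading that actually involves: byte 24 of the 128-byte EDID base block is the feature-support byte in EDID 1.4, and two of its bits advertise which color encodings a digital display accepts. A rough Python sketch, illustrative only; the Linux sysfs path is just an example of where a raw EDID dump might come from:

    # Minimal sketch: report the color encodings advertised in an EDID 1.4 base block.
    def supported_encodings(edid: bytes) -> str:
        if edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
            raise ValueError("not an EDID base block")
        formats = (edid[24] >> 3) & 0b11   # bits 4:3 of the feature-support byte (digital displays)
        return {
            0b00: "RGB 4:4:4 only",
            0b01: "RGB 4:4:4 + YCbCr 4:4:4",
            0b10: "RGB 4:4:4 + YCbCr 4:2:2",
            0b11: "RGB 4:4:4 + YCbCr 4:4:4 + YCbCr 4:2:2",
        }[formats]

    # Example (Linux, illustrative path):
    # with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
    #     print(supported_encodings(f.read()))

4:2:0 support and HDR capabilities are advertised separately in the CTA-861 extension block, so a real implementation would need to parse more than this one byte.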

The 1-2 seconds is highly exaggerated, and something that happens today with any decent UHD player anyway. Sure, it may vary slightly from device to device; on my LG OLED it is near real time.

But it is a million times better to actually display what is being received as opposed to changing it.
 
Last edited:
  • Like
Reactions: KBJ55

priitv8

macrumors 601
Jan 13, 2011
4,036
640
Estonia
Example - Chroma 4:2:0 vs 4:2:2. If you read their dialog, it says 4:2:0 was chosen for compatibility with most TVs, but 4:2:2 would provide better image quality.
Another question to ask is: where do you get consumer-targeted UHD sources with 4:2:2 chroma subsampling? In theory, 4:2:2 is better than 4:2:0, indeed. But practically, when your source is 4:2:0, it will not matter. AFAIK the BDA has settled on 4:2:0 for the UHD BD standard, so those discs will be the best-quality consumer-level sources around.
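As a back-of-the-envelope illustration of what the subsampling numbers mean: per 2x2 block of pixels there are always four luma samples, while 4:4:4, 4:2:2 and 4:2:0 carry four, two and one Cb+Cr pairs respectively. A rough sketch of the resulting uncompressed size ratios (nothing more than arithmetic):

    # Relative uncompressed size of the common chroma subsampling formats.
    def relative_size(scheme):
        chroma_pairs_per_2x2 = {"4:4:4": 4, "4:2:2": 2, "4:2:0": 1}[scheme]
        samples = 4 + 2 * chroma_pairs_per_2x2   # Y + (Cb + Cr) per 2x2 block
        return samples / 12                      # normalised to 4:4:4

    for s in ("4:4:4", "4:2:2", "4:2:0"):
        print(s, round(relative_size(s), 3))     # 1.0, 0.667, 0.5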
 

andrewstirling

macrumors 6502a
May 19, 2015
715
425
I really don't understand how you get to that conclusion.

I'm sorry, Palmerc, but you put out so many untruths with such confidence that it is rather dangerous, in my opinion. It is quite clear that you've never heard of Extended Display Identification Data (EDID), because if you had, you'd know that it is incredibly easy to determine those values. It is flag number 24 that will come back with the information that you highlighted above.

The 1-2 seconds is highly exaggerated, and something that happens today with any decent UHD player anyway. Sure, it may vary slightly from device to device; on my LG OLED it is near real time.

But it is a million times better to actually display what is being received as opposed to changing it.

I actually think the moderators should take some action. His 'opinions' are so removed from fact that, at best, they're misinformation and, at worst, lies.
 
  • Like
Reactions: cyb3rdud3

AVBeatMan

macrumors 603
Nov 10, 2010
5,726
3,624
Has anyone had the problem of subtitles disappearing when watching Narcos via Netflix? They keep switching off.
 

AVBeatMan

macrumors 603
Nov 10, 2010
5,726
3,624
Still happening on mine. Here's a short video to show what I mean. No subtitles; exit out, go back in, and they're back. Happens every 5 minutes or so.

 

palmerc

macrumors 6502
Feb 26, 2008
350
225
Another question to ask is: where do you get consumer-targeted UHD sources with 4:2:2 chroma subsampling? In theory, 4:2:2 is better than 4:2:0, indeed. But practically, when your source is 4:2:0, it will not matter. AFAIK the BDA has settled on 4:2:0 for the UHD BD standard, so those discs will be the best-quality consumer-level sources around.

Yes, less compression is in theory better, but you likely won't notice the difference in anything that moves. The problem with any of this is that it is difficult to make simplifying statements. It is also important to note that the ATV isn't a Blu-ray Disc player. :) Apple's approach (thus far) is that you set your ATV to output the best your TV can handle and they'll do the right thing.
I really don't understand how you get to that conclusion.

I'm sorry, Palmerc, but you put out so many untruths with such confidence that it is rather dangerous, in my opinion. It is quite clear that you've never heard of Extended Display Identification Data (EDID), because if you had, you'd know that it is incredibly easy to determine those values. It is flag number 24 that will come back with the information that you highlighted above.

The 1-2 seconds is highly exaggerated, and something that happens today with any decent UHD player anyway. Sure, it may vary slightly from device to device; on my LG OLED it is near real time.

But it is a million times better to actually display what is being received as opposed to changing it.

I think you're simplifying the common case to an absurd extent. My central contention here is that some TVs will be OK while others won't. The market is a mixed bag, and given Apple's tendencies, they have chosen consistency over configurability. This has never pleased a small segment of the market, but it has tended to work out for most people. I think you're trying to take what are technical challenges or shortcomings and blow them up into epic fails, but in reality only a handful of people care, and they should accept their opinion is fringe. For everyone else, watching a 4K movie on their Apple TV at only 95% of its potential will be delightful.

Yes, I'm familiar with EDID.
 
Last edited:

Patrick Turner

macrumors newbie
Sep 30, 2017
23
8
What is the point in outputting 24Hz, when most TVs will frame convert it to 120Hz or 60Hz anyway?

Setting my ATV to a 60Hz output, judder performance is the same as at 24Hz (in other words, great). I think where the problem may arise is when setting it to 50Hz, but there is really no reason for that unless you're using a European streaming app such as BBC iPlayer.
 
  • Like
Reactions: palmerc

palmerc

macrumors 6502
Feb 26, 2008
350
225
What are you talking about? Any on-the-fly "mapping" being done by converting SDR to HDR is going to result in the original SDR content being incorrect. There's no magic formula to convert SDR to HDR, and attempting to do so means the picture is no longer being seen as intended, i.e. sub-optimal colors and contrast. Just because it may subjectively look good to you doesn't mean it's correct, and anyone bothering to calibrate their TV isn't going to want the SDR content converted into fake HDR. I'm going to guess you have your TV set to torch mode. And the reason so many reviews complain about the image handling choices is because it was a stupid choice. You don't have to agree with that, but those who care about the quality of their video will.

The statement "those who care about the quality of their video" is elitist. At the very least it needs to be qualified with "above all other considerations." The idea that each and every setting should be artisinally selected and every media purchase hand curated is beyond elitist it is ridiculous. You have taken consumerism and your choices to be a reflection While you might think that there is one true way to watch video, most people care about quality to the extent that it doesn't require spending excessive amounts of time and energy getting it right. They want to be moving toward a better experience, not necessarily important to get the absolute best experience available at any given time.

That said, I have spent time calibrating my TV.

Mapping SDR to HDR is going to lead to incorrect colors, not least because you must make simplifying assumptions. Still, a 10-bit color space can represent every color in 8-bit color, so it can be good enough for most people; but if you're the kind of person willing to invest time and energy in calibration, you're going to be unhappy.
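To make the bit-depth point concrete, here is a toy sketch (my own illustration, not anything Apple documents): every 8-bit SDR code value has an obvious home in a 10-bit range, but the naive expansion creates no new tonal information. The visible damage comes from re-mapping the SDR gamma curve and Rec.709 primaries into PQ/BT.2020, which is exactly where the simplifying assumptions bite.

    # Toy sketch: naive 8-bit SDR -> 10-bit code-value expansion.
    def sdr8_to_10bit(value8):
        assert 0 <= value8 <= 255
        return round(value8 * 1023 / 255)              # 0 -> 0, 255 -> 1023

    print(sdr8_to_10bit(128))                          # 514: mid grey still lands on a single level
    print(len({sdr8_to_10bit(v) for v in range(256)})) # 256 -- no new shades appear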
I actually think the moderators should take some action. His ‘opinions’ are so removed from fact that, at best they’re misinformation, at worst they’re lies.

Let me say how I could be incorrect. For the record, some believe there is only one _true_ way to watch TV: a carefully calibrated TV set to the native frame rate of the input source, with the color mode set exactly to SDR or HDR depending on the original source material. That all TV hardware must be carefully selected and matched, and media carefully curated, such that the optimal experience is maintained at all times. If this is true, then I'm wrong.

That position is certainly a valid viewpoint. I'd consider it elitist. I'd say the obsession with a perfectly calibrated viewing experience is unnecessary and not what most people do. And yet regular people can still enjoy the fruits and experience of new technology. I believe the Apple TV as it stands handles this in such a way that most people won't notice, and in fact it'll depend in large part on the particular model of TV you own and the kinds of material you're consuming, be it Plex, iTunes, Netflix, HBO, YouTube or whatever. Content providers have a much greater influence on image quality than the Apple TV does.

If you believe that auto-switching can be done today on all TVs without problems, then again, I'm wrong. My contention is that this will be taken care of properly in HDMI 2.1, and that the current method on devices like the Roku provides a crap experience. Apple is quoted as saying to The Verge that the auto-switching experience is 'inelegant.' So, I guess if I'm wrong, at least I'm in good company.
 
Last edited:
  • Like
Reactions: Poontaco

-Gonzo-

macrumors 65816
Nov 14, 2015
1,445
786
What is the point in outputting 24Hz, when most TVs will frame convert it to 120Hz or 60Hz anyway?

Surely that all depends on whether your TV accepts 24Hz?
When I put a Blu-ray film in my player, my Panasonic auto-switches and shows it's playing at 24Hz, so setting the ATV to output the same for films makes sense, doesn't it?
 