New Apple dongle supports 4k HDR

Discussion in 'Apple TV and Home Theater' started by Cave Man, Aug 9, 2019.

  1. Cave Man macrumors 604

    Cave Man

    Joined:
    Feb 12, 2007
    Location:
    Neander Valley, Germany; just outside Duesseldorf
    #1
    Ars Technica is reporting that Apple has updated its A/V dongle for certain Macs to do 4K HDR video: 60 Hz on all newer Macs, other than the mini where it is limited to 30 Hz. I didn't see anything about DTS:X or Atmos passthrough, though.
     
  2. FarmerBob macrumors regular

    FarmerBob

    Joined:
    Aug 15, 2004
    #2
    I don't find HDR to be all it's cracked up to be. Nine times out of ten I turn it off and the flat, blown-out picture comes to life. But then for some movies and shows, HDR is the only way to watch them.
     
  3. cynics, Aug 10, 2019
    Last edited: Aug 10, 2019

    cynics macrumors G4

    Joined:
    Jan 8, 2012
    #3
    What model TV do you have? Most modern TVs have a specific setting (or set of settings) that is fairly accurately calibrated out of the box, generically speaking.

    HDR is higher bit depth plus a wider luminance range and color gamut; in other words, it's an expansion of the information in SDR. Absolute worst case, you can make it look similar to SDR, but even on a cheap HDR panel you can usually get a better image. However, if your display is trying to show you HDR information through SDR settings, it will generally look dark and/or washed out. That's because you are seeing whites and blacks that previously would have been clipped for being outside the SDR range. (I've put a rough numeric sketch of that extra headroom at the end of this post.)

    These are pictures from my iPhone of the TV (to visualize the difference) displaying the SDR and HDR versions of the same content. Both are generically calibrated.

    SDR

    IMG_6268.jpg

    HDR

    IMG_6269.jpg

    Now, I will say that in real life the SDR looks washed out and the HDR does not look as oversaturated as the above pics might suggest; keep in mind it's a pic from my iPhone of my TV.

    Sometimes you can find generic calibration settings for your TV here:

    https://www.rtings.com/tv/

    Also make sure your TV's HDMI UHD setting is turned on. Since this isn't automatic, you'll need to turn it on manually for the specific HDMI port you're using. That is the setting that lets the TV know it is connected to an HDR source.
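    To put rough numbers on the bit depth and clipping point above, here's a quick Swift sketch comparing an 8-bit, ~100-nit SDR signal against a 10-bit HDR10 signal using the SMPTE ST 2084 (PQ) curve, which is referenced to 10,000 nits. The function names are made up for the example, and SDR is simplified to a plain 2.4 gamma, so treat it as an illustration of the idea rather than how any real pipeline encodes video.

    Code:
    import Foundation

    // 8-bit SDR: clamp scene light at 100 nits, then gamma-encode.
    // (Simplified: real SDR uses BT.709/BT.1886, not a bare 2.4 gamma.)
    func sdrCodeValue(forNits nits: Double) -> Int {
        let clipped = min(nits, 100.0) / 100.0     // everything above 100 nits clips
        let encoded = pow(clipped, 1.0 / 2.4)      // simple display gamma
        return Int((encoded * 255.0).rounded())    // 256 code values per channel
    }

    // 10-bit HDR10: PQ inverse EOTF (SMPTE ST 2084), referenced to 10,000 nits.
    func hdr10CodeValue(forNits nits: Double) -> Int {
        let L = min(nits, 10_000.0) / 10_000.0
        let m1 = 2610.0 / 16384.0
        let m2 = 128.0 * 2523.0 / 4096.0
        let c1 = 3424.0 / 4096.0
        let c2 = 32.0 * 2413.0 / 4096.0
        let c3 = 32.0 * 2392.0 / 4096.0
        let p = pow(L, m1)
        let encoded = pow((c1 + c2 * p) / (1.0 + c3 * p), m2)
        return Int((encoded * 1023.0).rounded())   // 1024 code values per channel
    }

    // A 500-nit highlight has already clipped to peak white in SDR,
    // while the HDR10 signal still has plenty of code values above it.
    for nits in [50.0, 100.0, 500.0, 1_000.0] {
        print("\(nits) nits -> SDR \(sdrCodeValue(forNits: nits))/255, HDR10 \(hdr10CodeValue(forNits: nits))/1023")
    }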
     
  4. waw74 macrumors 68040

    Joined:
    May 27, 2008
    #4
    Range matching is your friend.
    Processed HDR is bad, i.e. when the aTV "upscales" SDR into HDR.

    If you have the aTV in HDR mode but not range match mode, then any SDR content is processed by the aTV and turned into HDR, usually with less than desirable results. "Blown out" is how I would describe it.

    Put the aTV into 4K SDR mode and turn on range and frame rate matching; then SDR content is played as SDR, and when you play HDR content the aTV will switch to HDR mode.

    You could also put the aTV in HDR mode with range matching and achieve the same effect (it will switch its output to SDR when you play something that is SDR). The switching logic is sketched at the end of this post.

    The main reason I keep mine in SDR is that every time you change the video signal type, the TV blacks out for a second or two. At least for me it's a longer blackout when changing range, and since most of my stuff is SDR, I encounter that blackout less if I have the aTV default to SDR.
    Plus it helps to make sure HDR actually kicks in, since my TV shows a small popup when it changes to HDR or Dolby Vision.
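    Here's roughly how I think about that switching logic, as a Swift sketch. The type and property names are just made up for illustration; this is not any real tvOS API.

    Code:
    // With range matching off, everything gets converted to whatever the
    // box is set to; with it on, the output follows the content.
    enum DynamicRange { case sdr, hdr }

    struct PlayerSettings {
        var defaultOutput: DynamicRange   // the "4K SDR" / "4K HDR" format setting
        var matchDynamicRange: Bool       // the "Match Dynamic Range" toggle
    }

    func outputRange(for content: DynamicRange, settings: PlayerSettings) -> DynamicRange {
        if settings.matchDynamicRange {
            return content                // output follows the content; TV re-syncs on change
        } else {
            return settings.defaultOutput // SDR gets processed into the fixed format here -> blown-out look
        }
    }

    // Default to SDR with matching on: SDR plays untouched, and the output
    // only flips to HDR when the content actually is HDR.
    let settings = PlayerSettings(defaultOutput: .sdr, matchDynamicRange: true)
    print(outputRange(for: .sdr, settings: settings))   // sdr
    print(outputRange(for: .hdr, settings: settings))   // hdr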
     
  5. GrumpyCoder macrumors regular

    Joined:
    Nov 15, 2016
    #5
    This is only true in marketing terms. Depending on what display you have and how restrictive the firmware is, you can easily set it to SDR and have P3 or BT.2020 color space with >100 nits. It's been done for ages, since before the term HDR was even a thing. Sony did a home theater demo with a 12-bit version of Resident Evil running off a Dolby rig, if I remember correctly, around 15 years ago. What you show in the screenshots is just a matter of processing limited by the display. Add a dedicated video processor into the mix and you're all set.
     
  6. cynics macrumors G4

    Joined:
    Jan 8, 2012
    #6
    That's semantics if you ignore the context. I'm referring to HDR10 and Dolby Vision video sources as they apply to the Apple TV 4K, compared on the same display.

    Outside of that context, HDR is meaningless. It's a term that's been used (and patented) since I was in school in the '90s, and it's a technique that's been used for 150+ years in photography.

    Of course my display is limited by its... limits. It's an LCD; right out of the box it has that inherent contrast limitation. The point is that an HDR10/DV television can process and map HDR10/DV sources, and once that is accounted for, it will give you a better image.
     
  7. priitv8 macrumors 68040

    Joined:
    Jan 13, 2011
    Location:
    Estonia
    #7
    Nonetheless, it is important to understand that the approaches taken with HDR in the photography and video worlds are exactly opposite.
    In photo, we compress the larger dynamic range of a scene into the smaller range of the display device (multi-exposure blending).

    In cinema, we enlarge the dynamic range of the display device itself, giving more headroom to the content.
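    A one-liner makes the photo direction concrete: a global tone-mapping operator squeezes an (in principle unbounded) scene luminance into the display's 0...1 range. I'm using the classic Reinhard curve here purely as an illustration; real multi-exposure photo pipelines are of course far more involved.

    Code:
    import Foundation

    // Photo direction: compress relative scene luminance (can be >> 1)
    // into 0...1 for an SDR display. Classic Reinhard global operator,
    // used here only to illustrate the idea.
    func reinhardToneMap(_ sceneLuminance: Double) -> Double {
        return sceneLuminance / (1.0 + sceneLuminance)
    }

    // A highlight 20x brighter than mid-grey still lands inside the SDR
    // range; it gets compressed rather than clipped.
    for l in [0.18, 1.0, 5.0, 20.0] {
        print(String(format: "scene %.2f -> display %.3f", l, reinhardToneMap(l)))
    }

    // The cinema/video direction is the opposite: the HDR10/DV signal keeps
    // its wide range, and the display is asked to reproduce as much of it
    // as it physically can, tone-mapping only what it cannot show.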
     
  8. cynics macrumors G4

    Joined:
    Jan 8, 2012
    #8
    No, it's not. For example, exposure blending is what the RED cameras' HDRx feature does for HDR footage. I thought that was also a feature on a couple of DSLRs, and how it functioned in their recording modes. That is the difference between native and "in-camera" HDR. (A generic sketch of the exposure-blending idea is at the end of this post.)

    Still, that's outside the context of this thread anyway. HDR is generic, and potentially meaningless when it comes to the Apple TV. Actual high-quality HDR10/DV videos come from 12-bit log footage that has been graded by someone who knows what they are doing.
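    For what it's worth, exposure blending in its simplest form looks something like the sketch below: a normal capture and a short, highlight-protected capture of the same frame get merged per pixel, leaning on the short exposure wherever the normal one is near clipping. The names and the weighting are mine and only illustrate the general idea; this is not RED's actual HDRx processing.

    Code:
    import Foundation

    // Blend one channel of two exposures of the same frame, both normalized
    // to 0...1. `stops` is how much darker the short exposure was captured.
    func blendExposures(normal: Double, short: Double, stops: Double = 2.0) -> Double {
        // Weight ramps from 0 to 1 as the normal exposure approaches clipping.
        let weight = max(0.0, (normal - 0.75) / 0.25)
        // Scale the short exposure back up to the normal exposure's level.
        let shortScaled = short * pow(2.0, stops)
        return (1.0 - weight) * normal + weight * shortScaled
    }

    print(blendExposures(normal: 0.5, short: 0.125))  // midtone: unchanged (0.5)
    print(blendExposures(normal: 1.0, short: 0.9))    // clipped highlight: recovered as ~3.6 (linear, beyond SDR white)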
     
