Photos in High Sierra: Is video export hardware-encoded or done in software?

Discussion in 'macOS High Sierra (10.13)' started by EugW, Jun 26, 2017.

  1. EugW macrumors 68000

    EugW

    Joined:
    Jun 18, 2017
    #1
    Currently in Photos in Sierra 10.12, when you drag a 2160p iPhone 4K H.264 video out of Photos, it transcodes it to 2160p H.264. Despite the format being the same, it does this to reduce the file size, I guess. For H.264 it's not necessary IMO, but that is the default behaviour, and it can bring even my 4.2 GHz iMac to its knees. Well, not quite, but the way the interface works it almost feels like the machine has hung, because it doesn't tell you it's transcoding. I'm assuming this is a software-based encode.

    To get around this, you can export the original instead (which is a two-step process).
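
    What Photos does internally isn't documented, but the difference between a passthrough copy and a full re-encode is easy to see with AVFoundation. A minimal sketch in Swift, assuming hypothetical file paths; a passthrough export never touches an encoder (hardware or software), which is why it doesn't peg the CPU the way a re-encode does:

    Code:
    import AVFoundation

    // Hypothetical paths, for illustration only.
    let sourceURL = URL(fileURLWithPath: "/path/to/clip.mov")
    let outputURL = URL(fileURLWithPath: "/path/to/exported.mov")

    let asset = AVAsset(url: sourceURL)

    // AVAssetExportPresetPassthrough copies the original H.264 samples as-is,
    // so no encoder is involved. A preset such as
    // AVAssetExportPresetHighestQuality re-encodes instead, which is the
    // CPU-heavy path.
    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPresetPassthrough) else {
        fatalError("Passthrough preset not available for this asset")
    }
    export.outputURL = outputURL
    export.outputFileType = .mov

    export.exportAsynchronously {
        // In a command-line tool you would need to keep the run loop alive
        // until this completion handler fires.
        switch export.status {
        case .completed:
            print("Exported without transcoding")
        default:
            print("Export failed: \(String(describing: export.error))")
        }
    }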

    But I'm curious. What does it do in High Sierra? Does it do hardware encoding for such 2160p transcodes? Or is it still software H.264 encoding? Or is the default now HEVC, so it does H.265 encoding instead? H.265 would be even more taxing if they didn't utilize hardware encoding.

    At the very least, I would hope Apple adds a message indicating that Photos is transcoding on export, so people don't think their computers have crashed.
     
  2. jsawy3r macrumors member

    Joined:
    Jun 3, 2014
    #2
    I think it's all hardware on High Sierra. I just moved an H.265 and an H.264 video from Photos to the Desktop, and in both instances CPU usage never went above 25% (of 800%). Both files were transcoded to H.264, and I tested this with both the Intel HD 530 and the Radeon Pro 460. No difference in CPU usage.

    Hope that helps! Never checked what usage was like on Sierra, so I can't give you a before data set.
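
    For anyone who wants to check directly rather than infer from CPU usage: VideoToolbox will tell you whether a hardware encoder exists for a codec if you require hardware acceleration when creating a compression session. A rough sketch in Swift (the 3840x2160 dimensions are just an example, not anything Photos necessarily uses):

    Code:
    import VideoToolbox

    // Ask VideoToolbox for a compression session that *must* be hardware
    // accelerated; if no hardware encoder exists for the codec, creation fails.
    func hasHardwareEncoder(for codec: CMVideoCodecType) -> Bool {
        let spec = [
            kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder as String: true
        ] as CFDictionary
        var session: VTCompressionSession?
        let status = VTCompressionSessionCreate(
            allocator: nil,
            width: 3840,
            height: 2160,
            codecType: codec,
            encoderSpecification: spec,
            imageBufferAttributes: nil,
            compressedDataAllocator: nil,
            outputCallback: nil,
            refcon: nil,
            compressionSessionOut: &session)
        if let session = session { VTCompressionSessionInvalidate(session) }
        return status == noErr
    }

    print("Hardware H.264 encode:", hasHardwareEncoder(for: kCMVideoCodecType_H264))
    print("Hardware HEVC encode:", hasHardwareEncoder(for: kCMVideoCodecType_HEVC))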
     
  3. EugW thread starter macrumors 68000

    EugW

    Joined:
    Jun 18, 2017
    #3
    Thanks. What machines?
     
  4. jsawy3r macrumors member

    Joined:
    Jun 3, 2014
    #4
    2016 MBP, 2.9 GHz, with the Radeon Pro 460.

    I've also run the Sony camp videos in QuickTime. 8-bit HEVC runs flawlessly with minimal CPU usage. 10-bit HEVC is a stutter-fest, but oddly CPU usage only goes up to about 200% (I would expect the CPU to try a little harder...). I know Skylake only has 8-bit HEVC hardware decode, but the Pro 460 has 10-bit hardware decode, and I couldn't get it to run smoothly even when forcing discrete GPU usage. My guess is that QuickTime is only using the CPU even when gfxCardStatus forces the discrete GPU. Hopefully Apple gets the 2016s to use the Radeon to decode 10-bit, but I'm guessing they just want us to buy newer laptops...
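
    On the decode side, macOS 10.13 adds VTIsHardwareDecodeSupported(), which reports whether a hardware decoder is present for a given codec. A small sketch in Swift; note the call only takes a codec type, so it can't distinguish 8-bit Main from 10-bit Main 10, which means a Skylake machine can report hardware HEVC support here and still fall back to software for 10-bit clips:

    Code:
    import VideoToolbox

    // macOS 10.13+: reports whether any hardware HEVC decoder is present.
    // It does not distinguish 8-bit (Main) from 10-bit (Main 10) profiles.
    if VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC) {
        print("Hardware HEVC decode available")
    } else {
        print("HEVC will be decoded in software")
    }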
     
  5. EugW thread starter macrumors 68000

    EugW

    Joined:
    Jun 18, 2017
    #5
    That's awesome info, thanks. How's the quality?

    Apple specifically stated that hardware 10-bit HEVC decode will require Kaby Lake / 7th-gen Intel Core. I suspect they just can't be bothered to make a GPU-specific implementation at this point; they'd rather optimize for one target and be done with it. I doubt they'll ever bring it to the Skylake models, unfortunately.

    Anyhow, if true, it's great that these Skylake and Kaby Lake models get hardware encode, for two reasons:

    1) I suspect the speeds would be in the same ballpark regardless of CPU (within the same class, that is). Or would they?

    2) No fans going crazy.
     
