
MacBH928

macrumors G3
Original poster
May 17, 2008
9,019
4,097
So this HEVC/H.265 has been around for some time, promising higher quality at smaller sizes, but I have hardly seen anything using or supporting it. H.264 still seems to be king. Last I heard there is a new format coming soon which is open source and free, called AV1, but even this seems rare.

Anyone know what's going on? Surely smaller sizes mean smoother streaming and less storage.
 
H.265 is in a lot of places already. I'm pretty certain 4K Blu-ray discs use it. A lot of YouTubers export and upload in H.265, though I'm not actually sure what YouTube itself stores or serves. A lot of cameras can write H.265 files, and if I'm not mistaken iPhone video can too, but I'm not entirely certain on that one.
I also use it for uploading videos to my own web server, so it's definitely in use, I can tell you that much.

AV1 looks like a good codec, but HEVC will continue to gain traction and popularity, in part because of just how much of the market has hardware acceleration for it.
 
YouTube uses H.264 and VP9, as far as I can tell from clicking "Stats for nerds" on its videos. The codec exists, but somehow H.264 still remains king, although H.265 gives better quality at smaller sizes.
 
HEVC is used in DVB-T2, and by every streaming service (except YouTube) to deliver 4K HDR: Netflix, Amazon Prime, AppleTV+, etc.
 
I don't think they stream to all devices in HEVC, though; some devices probably can't decode HEVC, because it needs a hardware decoder.
 
H.265 is widely supported on modern streaming devices. Even some older devices support it, with caveats. To your earlier question, yes, many services use H.265: for example, if many people stream 4K from your service, you will want to save on bandwidth costs.
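To put rough numbers on the bandwidth argument (the 32 Mbps bitrate and the ~40% HEVC savings below are illustrative assumptions, not measured values):

```python
# Rough sketch of codec efficiency vs. streaming bandwidth.
# ASSUMPTIONS: the 32 Mbps H.264 figure and the 40% HEVC savings
# are illustrative, not measurements of any real service.

H264_4K_MBPS = 32.0            # assumed H.264 bitrate for 4K
HEVC_SAVINGS = 0.40            # HEVC is often cited as ~40-50% smaller

hevc_mbps = H264_4K_MBPS * (1 - HEVC_SAVINGS)

def gb_per_hour(mbps: float) -> float:
    """Convert a bitrate in megabits/s to gigabytes per hour."""
    return mbps / 8 * 3600 / 1000

h264_gb = gb_per_hour(H264_4K_MBPS)
hevc_gb = gb_per_hour(hevc_mbps)
print(f"H.264 4K: {h264_gb:.1f} GB/hour, HEVC 4K: {hevc_gb:.1f} GB/hour")
# → H.264 4K: 14.4 GB/hour, HEVC 4K: 8.6 GB/hour
```

Even with made-up numbers, cutting per-hour data by nearly half at the same perceived quality is real money at streaming scale.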
 
AV1 is already here, as there are consumer products that support it, and services that stream in it. It is early for it, but it is here, and it will likely become the de facto standard for many years.
 
Of course, they use it only when the device supports it.

Widespread AV1 usage is still years away.
 
It's supported on the $20 Amazon Fire Stick, which will undoubtedly see a huge pick-up in its use. Also, all but the Chinese own-brand smart TVs have supported it natively since the 2014 model year, and even the Chinese brands were supporting it by 2017.

Windows 10 had support from release, and Intel introduced hardware encoding with Skylake.

Basically, wanting HEVC support and not having it is fast becoming unlikely. I think realistically the Mac community is the most likely to have devices without support, given the propensity to hold on to ancient hardware.
 
I’m sure my trusty iMac G3 can software decode it...
 
I wish. I am a huge proponent of open-source and free standards, and video was one of the last things that didn't have one; even Wikipedia uses .ogg or something like that. Albeit, last I heard AV1 had a major issue with encoding time: something insane, like taking 40 hours to encode what HEVC can encode in 2. Also, I am not sure about the hardware decoding part; is it supported on modern hardware?
 
Nvidia GeForce RTX 30 series GPUs, AMD RDNA 2 GPUs, many 2020 TVs and most 2021 TVs, Intel 11th-generation CPUs, and Apple M1 SoCs all have AV1 hardware decoding.

As for encoding, AV1 is much slower than, for instance, x264 and x265, but that will likely improve over time. Not to the same level, but it will get better. Fortunately, encoding is only done once. :)
 
Unless QC finds problems, or wants to make changes in the final release, and you get to encode over and over again :) Granted, larger production shops are used to dealing with long encode timelines.
 
Here is a question: if HEVC is compressed and it is sold as glorious 4K, why do they shoot in RAW in the first place? Why not shoot in HEVC and avoid the whole conversion?

Nvidia GeForce RTX 30 series GPUs, AMD RDNA 2 GPUs, many 2020 TVs and most 2021 TVs, Intel 11th-generation CPUs, and Apple M1 SoCs all have AV1 hardware decoding.

As for encoding, AV1 is much slower than, for instance, x264 and x265, but that will likely improve over time. Not to the same level, but it will get better. Fortunately, encoding is only done once. :)

Well, that is pleasing to hear, but those are all recent hardware, which means AV1 becoming a standard is still years away. From the perspective of producers, why encode in HEVC or AV1 when H.264 will work on anything? Wait some years, then switch to AV1 when everyone has devices that support it.

It's a problem for every non-professional. I am thinking of all the apps on iOS and people uploading to YouTube: why use a format that takes 2 hours when the other takes just 15 minutes? Then you have the digital libraries: how long would it take to convert the whole Netflix library? One show, Friends, has something like 240 episodes; how many hours just for that one show?
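The back-of-the-envelope math for a big back catalogue looks something like this (both per-episode encode times are invented for illustration, not benchmarks of any real encoder):

```python
# Back-of-the-envelope encode-time math for a large back catalogue.
# ASSUMPTIONS: both per-episode times are invented, not benchmarks.

EPISODES = 240                # roughly the Friends back catalogue
H264_HOURS_PER_EP = 0.25      # assume ~15 minutes per episode in H.264
AV1_HOURS_PER_EP = 2.0        # assume a far slower AV1 encode

h264_total = EPISODES * H264_HOURS_PER_EP
av1_total = EPISODES * AV1_HOURS_PER_EP
print(f"H.264: {h264_total:.0f} h, AV1: {av1_total:.0f} h "
      f"({av1_total / h264_total:.0f}x longer)")
# → H.264: 60 h, AV1: 480 h (8x longer)
```

In practice a production shop would parallelize encodes across a farm, so the wall-clock time is much lower, but the total compute cost still scales the same way.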

Then we have the philosophical argument: why switch to AV1 when H.264 is more than good enough?
 
Why switch to DVD when VHS is more than good enough? Why switch to Blu-Ray when DVD is more than good enough?

Besides, it's also not just about quality; it's about quality relative to file size. A smaller file producing the same quality means Netflix can pack more shows onto any given storage drive, reducing the cost of their server installations. And frankly, they can afford to get a custom ASIC made that can encode/transcode to AV1 in no time at all, and it would still be a cost benefit to them, as it would save on hardware costs in the long term, assuming it's a format that can be pushed to their customers, i.e. that customers can decode it quickly and at low power consumption, as they can h.264 and h.265 today.
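A quick sketch of that storage argument, with an invented catalogue size and invented per-hour figures:

```python
# Sketch of the storage argument: same quality, smaller files, more
# shows per drive. ASSUMPTIONS: the catalogue size and per-hour
# figures are invented for illustration only.

CATALOGUE_HOURS = 36_000       # hypothetical total hours of content
H264_GB_PER_HOUR = 3.0         # assumed 1080p H.264 figure
HEVC_GB_PER_HOUR = 1.8         # assumed ~40% smaller at equal quality

h264_tb = CATALOGUE_HOURS * H264_GB_PER_HOUR / 1000
hevc_tb = CATALOGUE_HOURS * HEVC_GB_PER_HOUR / 1000
print(f"H.264: {h264_tb:.0f} TB, HEVC: {hevc_tb:.0f} TB, "
      f"saved: {h264_tb - hevc_tb:.0f} TB")
# → H.264: 108 TB, HEVC: 65 TB, saved: 43 TB
```

The same multiplier applies to every replicated copy in every CDN edge node, which is where the savings really compound.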
 
One shoots RAW to have the highest-quality source material, and edits that way. This preserves the quality as much as possible.

If you shoot with lossy compression, every time you edit you introduce additional image degradation.

There's an old computer science acronym for this: GIGO ("Garbage In, Garbage Out").

When you shoot and edit RAW, the final product is still at the same quality; it is then converted into various formats based on the specific use case.

A big movie studio may be shooting 8K RAW on $200,000 movie cameras with $50,000 lenses, and they will ultimately output different formats: high-resolution files for movie-theater projection, 4K files for 4K Blu-ray and 4K streaming, 1080p files for legacy Blu-ray and ordinary streaming, 480p for DVDs and low-end streaming, etc.

This isn't specific to digital videography. Digital still photographers have been shooting and editing RAW for years and years for the same reason, and the audio industry has been doing uncompressed digital audio capture for decades.
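The generational-loss point can be sketched with a toy model (the 0.97 "quality retained per re-encode" factor is made up purely for illustration, not a property of any real codec):

```python
# Toy model of generational loss from repeated lossy re-encoding.
# ASSUMPTION: the 0.97 "quality retained per generation" factor is
# invented to illustrate why editing happens on RAW/lossless masters.

QUALITY_PER_GENERATION = 0.97

def quality_after(generations: int,
                  retained: float = QUALITY_PER_GENERATION) -> float:
    """Fraction of original quality left after n lossy re-encodes."""
    return retained ** generations

for n in (1, 5, 10):
    print(f"{n} generation(s): {quality_after(n) * 100:.0f}% of original")
# → 97%, 86%, and 74% after 1, 5, and 10 generations
```

Editing a RAW master sidesteps this entirely: you compress exactly once, at the very end, per deliverable.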
 
Well, VHS to DVD is noticeably better, but I doubt anyone can look at a Netflix film and tell me which codec it is. While smaller sizes save storage, video providers don't seem to take advantage of it, as they seem to store multiple versions of each file in 720p, 1080p, and 4K, and sometimes in multiple codecs. This is not saving space; this is adding storage.

Whatever it is, I am just glad we finally have an open-source, free video standard. I hope AV1 is not inferior in quality to other standards, today or in the future.
 
To reach a broader market you will need several deliverables, so you can target each client's capabilities. But it will still save space, as an industry, to switch to better-compressed codecs. Over time, there will be a point where older codecs can be dropped in favour of newer ones, or higher resolutions can be limited to the more efficient codecs.
 
We have already seen this in our lifetimes -- even young master casperes1996.

If you watch a 480p streaming video today, it is not MPEG-2 like early DVDs.

These video codecs are strongly correlated with playback resolutions: h.264 is 1080p, more or less; h.265 is 4K, more or less; AV1 is the favored codec for 8K.

The OP needs to understand that it's also in the streaming service provider's best interest to select the content encoded with the best codec for a given device ("hi, I'm a five-year-old Model X streaming stick supporting 1080p and h.264 hardware decoding" versus "hi, I'm a 2020 smartphone supporting 4K Dolby Vision" versus "hi, I'm a high-end Windows 10 PC with an Nvidia Ampere GPU supporting 8K and AV1 hardware decoding"). After all, the streaming service pays for its network connection.
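That negotiation can be sketched as a simple preference list (the codec names and device profiles here are hypothetical; real services negotiate capabilities through DASH/HLS manifests):

```python
# Sketch of server-side codec selection: serve the most efficient codec
# the client reports it can decode. ASSUMPTIONS: these codec names and
# device profiles are hypothetical, not any real service's logic.

CODEC_PREFERENCE = ["av1", "hevc", "h264"]   # most efficient first

def pick_codec(device_decoders: set) -> str:
    """Return the preferred codec among those the device can decode."""
    for codec in CODEC_PREFERENCE:
        if codec in device_decoders:
            return codec
    return "h264"  # near-universal fallback

print(pick_codec({"h264"}))                 # old streaming stick
print(pick_codec({"h264", "hevc"}))         # 2020 smartphone
print(pick_codec({"h264", "hevc", "av1"}))  # Ampere-class PC
```

This is also why services keep several encodes of the same title: each device class gets the best stream it can actually play.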
 
Well, VHS to DVD is noticeably better, but I doubt anyone can look at a Netflix film and tell me which codec it is.
You're not supposed to. Well, not Joe Consumer anyhow.

That would be like listening to a song and then telling the artist "that's a sweet Model Q microphone you used but you should really use Brand Y cables."
 
Indeed. Excellent points :)
 
Sure, in that future, but that future is far away. I am not saying NOT to move on; all I am saying is that that step is not right now, as many devices do not support AV1 natively, and maybe not even H.265.

How far do you think they can continue to make more efficient codecs? It's interesting, because the cheaper large storage gets, and the faster transfer speeds get, the less pressure there is to shrink files. Unfortunately we won't see too much of that transition, because H.264 was released 17 years ago and is still the most ubiquitous. Depending on your age, you probably have about 2 more codecs to see, one of which is AV1 (40 years).

I was making the point that providers do not benefit storage-wise from the higher-efficiency codecs, because now they have to store H.264 + H.265 + AV1 versions; plus you have all that encoding time, and I am not sure how long it would take to re-encode a whole library.
 