
H.264 isn't based on MPEG-2 in any sense other than that they're both video compression technologies.

There is no GPU acceleration of MPEG-2 decoding and there never has been; it's not needed at all. My old 333 MHz Pentium II used to encode MPEG-2 in real time, let alone decode it.

That isn't true either: it's like saying English written by me is harder to read than English written by you when we're both selecting from the same vocabulary. It's just flat-out impossible, though it is possible to ask me/the encoder to use more complex words (compression features) that are harder to decode.

EDIT: Turns out I'm wrong; there are generic video decoding wrappers for GPUs that support MPEG-2, but that's not their primary purpose. Any GPU decoding work nowadays is done solely for accelerating H.264, though it may use a more generic abstraction.

Everything the original poster said pretty much just blew right by you. He was saying that hardware acceleration of video is not a new concept and Apple has been late getting on the ball with H.264.

Yes H.264 is a standard and is/should be pretty much the same no matter what program makes the file. But still, Quicktime is using SOME metric to determine whether a specific H.264 video file should be decoded in hardware or software. That is what the original poster was talking about. (It even says so in the article summary!) It's likely metadata that flips the magical switch if anything.
 
We assume the new MacBook Air would share the same advantage as it is based on the same graphics chipset as the new MacBook, though we haven't heard from an Air owner yet.

That's going to be hard considering the new MacBook Air won't be released until early November.
 
But still, Quicktime is using SOME metric to determine whether a specific H.264 video file should be decoded in hardware or software. That is what the original poster was talking about. (It even says so in the article summary!) It's likely metadata that flips the magical switch if anything.

I would keep an eye on the Handbrake developers for this (they make an app for converting DVDs to MP4 files). IIRC, they discovered the metadata 'atom' that allows Apple to pass Dolby Digital through H.264 files. I'm sure they'd like to put an 'Allow GPU decoding' tickbox amongst their encoding options, so watch that space.
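For anyone curious what an 'atom' actually is: MP4/QuickTime files are a sequence of boxes, each starting with a 4-byte big-endian size followed by a 4-byte type code. Here's a minimal sketch of a top-level atom walker (the file path is whatever movie you have lying around):

```python
import struct

def list_atoms(path):
    """Return [(atom_type, size), ...] for the top-level atoms of an MP4/MOV file."""
    atoms = []
    with open(path, "rb") as f:
        while True:
            header = f.read(8)
            if len(header) < 8:
                break
            size, kind = struct.unpack(">I4s", header)
            if size == 1:                        # 64-bit size stored in the next 8 bytes
                size = struct.unpack(">Q", f.read(8))[0]
                payload = size - 16
            elif size == 0:                      # atom runs to the end of the file
                atoms.append((kind.decode("ascii", "replace"), size))
                break
            else:
                payload = size - 8
            atoms.append((kind.decode("ascii", "replace"), size))
            f.seek(payload, 1)                   # skip the atom body
    return atoms
```

On a typical movie this prints types like `ftyp`, `moov`, and `mdat`; the metadata atoms people talk about live nested inside `moov`.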
 
Everything the original poster said pretty much just blew right by you. He was saying that hardware acceleration of video is not a new concept and Apple has been late getting on the ball with H.264.

Yes H.264 is a standard and is/should be pretty much the same no matter what program makes the file. But still, Quicktime is using SOME metric to determine whether a specific H.264 video file should be decoded in hardware or software. That is what the original poster was talking about. (It even says so in the article summary!) It's likely metadata that flips the magical switch if anything.
What he said hardly "blew right by me" ;) I've spent about 8 months researching video encoding/decoding for my CS degree, especially H.264 based decoding/encoding.

Hardware acceleration for H.264 *is* a new concept, his/her tortured analogy to nonexistent MPEG-2 GPU decoding (nothing like H.264) aside. If you spend a few months looking into video encoding/decoding, you'll see there are very few GPU-based solutions; the only viable ones right now are DXVA 2.0 for Windows and a new solution based on libavcodec from the ffmpeg project that uses CUDA, which isn't compiled for Mac/Linux yet. This is the first and only GPU decoding solution available for the Mac, so this is hardly Apple playing catch-up. They've already had a GPU-based solution on the market for some time now: the Apple TV.

The poster wasn't talking about magical metadata either... he was saying that the encodes he has seen, which he thinks were made with ffmpeg (impossible as stated; I assume he means ffmpeg using libx264), are more complex than the H.264 streams encoded by Quicktime.

Here's an excellent article about the "metadata" you refer to: it's more just basic information about the video stream. http://rob.opendot.cl/index.php/useful-stuff/h264-profiles-and-levels.
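That "basic information" is literally the first few bytes of the stream's decoder configuration: in an MP4 it sits in the 'avcC' record, whose layout is fixed by the spec. A minimal sketch of reading the profile and level, assuming you've already dug the avcC payload out of the file:

```python
# H.264 profile_idc values from the spec (partial table)
PROFILES = {66: "Baseline", 77: "Main", 88: "Extended", 100: "High"}

def parse_avcc(avcc):
    """Read profile and level from an AVCDecoderConfigurationRecord.

    Byte 0: configurationVersion (always 1)
    Byte 1: AVCProfileIndication (profile_idc)
    Byte 2: profile_compatibility (constraint flags)
    Byte 3: AVCLevelIndication   (level_idc = level * 10)
    """
    if len(avcc) < 4 or avcc[0] != 1:
        raise ValueError("not a valid avcC record")
    profile_idc, level_idc = avcc[1], avcc[3]
    profile = PROFILES.get(profile_idc, "profile_idc %d" % profile_idc)
    return profile, level_idc / 10.0
```

So `bytes([1, 77, 0, 30])` decodes as Main profile, level 3.0, which is exactly the kind of thing a player could key its hardware/software decision on.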
 
I would keep an eye on the Handbrake developers for this (they make an app for converting DVDs to MP4 files). IIRC, they discovered the metadata 'atom' that allows Apple to pass Dolby Digital through h264 files. I'm sure they'd like to put an 'Allow GPU decoding' tickbox amongst their encoding options, so watch that space.
I really, really doubt that Apple re-encoded all their trailers to enable GPU decoding that they haven't even announced publicly yet. Quicktime is almost certainly analyzing the stream to decide whether to play it or not; unfortunately, I have neither of these laptops to test this.
 
What he said hardly "blew right by me" ;) I've spent about 8 months researching video encoding/decoding for my CS degree, especially H.264 based decoding/encoding.

Hardware acceleration for H.264 *is* a new concept, his/her tortured analogy to nonexistent MPEG-2 GPU decoding (nothing like H.264) aside. If you spend a few months looking into video encoding/decoding, you'll see there are very few GPU-based solutions; the only viable ones right now are DXVA 2.0 for Windows and a new solution based on libavcodec from the ffmpeg project that uses CUDA, which isn't compiled for Mac/Linux yet. This is the first and only GPU decoding solution available for the Mac, so this is hardly Apple playing catch-up. They've already had a GPU-based solution on the market for some time now: the Apple TV.

The poster wasn't talking about magical metadata either... he was saying that the encodes he has seen, which he thinks were made with ffmpeg (impossible as stated; I assume he means ffmpeg using libx264), are more complex than the H.264 streams encoded by Quicktime.

Here's an excellent article about the "metadata" you refer to: it's more just basic information about the video stream. http://rob.opendot.cl/index.php/useful-stuff/h264-profiles-and-levels.


Nice reply.

I hate getting snarky replies as well. We're all here posting what we know. Even if someone "p0wned" someone else in terms of what they know about a topic, that doesn't mean we should rub it in their face.

Play nice, kids. :)

Let's hope this feature will be enabled on older hardware as well. My trusty 2007 MBP would like it :)

I'm sorry, but judging from Apple's history, they won't. ;)

However, judging from Apple's more recent history, they may be more likely to succumb, as they're a big company now, and many eyes watch, and criticize, them in blogs and the media.
 
Has Cringely finally been proven right?

Robert X Cringely (PBS) predicted a year and a half ago that Apple would add hardware H.264 decoding and encoding to its entire product line, which would give them the ability to encode video to H.264 in real time (think Tivo in your Mac). This March, he predicted that Apple wasn't rushing to support Blu-Ray because H.264 support and faster connections would make it feasible to sell 1080p HD content through iTunes. And in August, he predicted that the "product transition" that was going to cut into profits a bit this quarter would involve H.264 support.

I think the profit hit came more from the new manufacturing process for the aluminum laptops... but his original rumor from March 2007 might be panning out.
 
H.264 isn't based on MPEG-2 in any sense other than that they're both video compression technologies.

Simply wrong. See Wikipedia. H.264 is simply MPEG-2 plus some obvious corrections for artifacts that I see daily in American MPEG-2 HDTV broadcasts.

There is no GPU acceleration of MPEG-2 decoding and there never has been; it's not needed at all. My old 333 MHz Pentium II used to encode MPEG-2 in real time, let alone decode it.

I'll be sure to reconfigure my Mac MythTV player to stop using the GPU acceleration then. Perhaps I'll ask that they stop referring to it as "the undocumented acceleration used by DVD Player". :)

That isn't true either: it's like saying English written by me is harder to read than English written by you when we're both selecting from the same vocabulary.

You're telling me something isn't true based on your understanding of H.264, which I don't particularly disagree with. Regrettably, I've been working on a Video Transcoding application which uses Quicktime's H.264 decoder so I've found out the hard way that this understanding is entirely in error. I suspect Apple have placed some optimizations in for handling H.264 when encoded with their preferred set of parameters and sizing which they generate in Quicktime.
 
re. .264 encoding - is this GPU utilization something utilities like handbrake must build into their app or is it automatically enabled via the OS?

I ripped a DVD last night with Handbrake on my new 2.4 GHz MacBook and it took pretty much the same time as on my 2.0 GHz iMac!
 
What he said hardly "blew right by me" ;) I've spent about 8 months researching video encoding/decoding for my CS degree, especially H.264 based decoding/encoding.

I've been working as an Engineer in the Video Recording industry for 23 years.... including the first-ever compressed video tape recorder.

And for the innocent bystanders, keep in mind that another name for MPEG-2 is H.262. In other words, H.264 is intentionally the "next generation" of MPEG, created by essentially the same committee. Just as they did not throw away MPEG-1 when they created MPEG-2, they did not entirely throw away MPEG-2 when they created H.264. Though the precise details of the coding are different from MPEG-2, they are not wildly different compared to other formats like DV or ProRes.

Hardware acceleration for H.264 *is* a new concept, his/her tortured analogy to nonexistent MPEG-2 GPU decoding (nothing like H.264) aside.

Here's some documentation on the "non-existent" acceleration: http://www.defyne.org/dvb/accellent.html
 
re. .264 encoding - is this GPU utilization something utilities like handbrake must build into their app or is it automatically enabled via the OS?

Handbrake does not use the Quicktime libraries (nor any other Apple-created libraries) to perform video encodings. Since Handbrake is not relying upon Apple to help out with the encode, nothing Apple does to the OS will affect Handbrake's encode performance.

On the other hand, if Apple were to create an API to assist with H.264 encodes, Handbrake could be modified to take advantage of it. But for now, my understanding is that we are speculating about accelerated decode, not encode.
 
I've found some info about this on NVIDIA's website. Their term for this tech is PureVideo HD, and is present on GeForce 8400, 8500, 8600 desktop GPUs and 8400M & 8600M mobile GPUs. They also mention it on the 9400M page (as seen in the new MacBook) and 9600M page (as seen in the new MacBook Pro).

This would at least imply that the GPUs in the old MacBook Pro (8600M GT) and some Mac Pros (amongst others) are capable of H.264 hardware decode; it remains to be seen whether Apple will add support for these other chips...
 
What about encoding?

Is it possible to speed up H.264 encoding using the GPU as well - or is that a CPU-only task?


All I know is that on a PB G4 867 PPC running 10.4.11, I essentially cannot play HD video I shot myself at all.
On a 2006 Intel MacBook it barely plays, somewhat choppy at times. Not H.264 in either case.
It seems we are only barely getting hardware up to the task of playing common modern TV...

My PSP (Playstation Portable) plays H.264 encoded video without any problems (it's a wonderful machine for taking along a bunch of movies for the kids - and the quality of the PSP screen as well as H.264 encoded DVDs is a marriage made in heaven) :)
 
re. .264 encoding - is this GPU utilization something utilities like handbrake must build into their app or is it automatically enabled via the OS?

I ripped a DVD last night with Handbrake on my new 2.4 GHz MacBook and it took pretty much the same time as on my 2.0 GHz iMac!

It's likely that the bottleneck for ripping DVDs is the drive, not the processor speed. If you wanted, you could dump the DVD onto the hard drive and then compare conversion speeds on each machine; this would eliminate the bottleneck. :)
 
I'm surprised that H.264 acceleration is new: Apple has long had MPEG-2 acceleration, and H.264 is based on MPEG-2.

I've noticed on both Macs and Windows that Quicktime decode of H.264 depends upon who encoded it. Decode of FFMPEG-created H.264 takes much more CPU than Quicktime-created H.264.

Some movies will probably still be decoded by the CPU because they are "high-profile H.264". The codec has quite a lot of settings to tweak so you can achieve higher quality with less bitrate; encoding and decoding are then more CPU-intensive.

The iPhone, for instance, can only play videos with low-profile settings, 1500 kbit/s and 640x480 resolution. That's what the decoding hardware was made for; it won't play anything exceeding those parameters. The hardware in the new MacBooks is probably no different; it will decode videos within certain parameters.
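That "within certain parameters" idea amounts to a simple capability check. Here's a sketch; the numbers are lifted from the iPhone figures above as illustrative limits, not an official spec:

```python
# Illustrative decoder limits, modeled on the iPhone figures above (assumptions)
LIMITS = {
    "profiles": {"Baseline"},   # low-profile only
    "max_kbps": 1500,           # ~1500 kbit/s ceiling
    "max_width": 640,
    "max_height": 480,
}

def hardware_can_decode(profile, kbps, width, height, limits=LIMITS):
    """Return True if a stream fits within the hardware decoder's limits;
    anything outside them would fall back to CPU decoding (or not play)."""
    return (profile in limits["profiles"]
            and kbps <= limits["max_kbps"]
            and width <= limits["max_width"]
            and height <= limits["max_height"])
```

A 640x480 Baseline stream at 1200 kbit/s passes; a High-profile or 720p stream fails the check and would be handled in software, which matches the behavior people are reporting.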


FFMPEG has settings that specifically say "will break Quicktime compatibility". Quicktime won't play those files unless you have installed Perian. So make sure your encoded videos are Quicktime-compatible if you want them to use the hardware. Apple might also just have optimized Quicktime to use features of these new CPUs, though.
 
re. .264 encoding - is this GPU utilization something utilities like handbrake must build into their app or is it automatically enabled via the OS?

I ripped a DVD last night with Handbrake on my new 2.4 GHz MacBook and it took pretty much the same time as on my 2.0 GHz iMac!

Handbrake relies on open source libraries to do the actual encoding. And these open source libraries are completely portable, don't rely on Macintosh-specific features at all, and certainly don't use GPU hardware. Last time I checked, they don't even use vector instructions either on PowerPC or on Intel CPUs.
 
In order for these results to make sense you have to test the same movie at the same timeframe.

Otherwise the results will vary a great deal, seeing as not all scenes have the same bitrate.
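The scene-to-scene variation is easy to see if you compute bitrate over short time windows instead of one whole-file average. A sketch, assuming you've pulled per-frame (timestamp, size) pairs out of some demuxer:

```python
def bitrate_per_window(frames, window=1.0):
    """Average bitrate (kbit/s) per consecutive time window.

    frames: list of (timestamp_seconds, size_bytes) pairs.
    Returns [(window_start_seconds, kbit_per_s), ...] in time order.
    """
    buckets = {}
    for ts, size in frames:
        idx = int(ts / window)
        buckets[idx] = buckets.get(idx, 0) + size   # total bytes in this window
    return [(idx * window, buckets[idx] * 8 / 1000.0 / window)
            for idx in sorted(buckets)]
```

Plot that for any movie and you'll see why comparing CPU usage on two different scenes (or two different timeframes) tells you very little.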
 
Simply wrong. See Wikipedia. H.264 is simply MPEG-2 plus some obvious corrections for artifacts that I see daily in American MPEG-2 HDTV broadcasts.

In the same sense that a Ferrari is just a Ford Model T with some obvious corrections for ugly shape, boring colour and lousy top speed.
 