This could be one of those misinterpreted rumors. It seems like they are finding a way for the GPU to do it, as that is the coming trend. :cool:
 
Video cards are not made to encode and de-code H.264. The processor still does that before passing it on to the video card.

Yes they are. ATI/AMD have UVD, NVIDIA have PureVideo, and Intel now have a unit in their latest chipset. This isn't new stuff either: UVD and PureVideo have been around for a couple of years, and they have a real, noticeable benefit and reduce power consumption in mobile platforms too.
 
A couple of years ago at MWSF I asked an Apple laptop engineer about this, as a recent article had mentioned that the GPU could do H.264 encoding. He said they weren't doing that because they got better-quality output by doing it in software. That raises the question: are all H.264 encoders the same, or is there room in the implementation to trade final quality for speed or simplicity of the algorithm? So if Apple decides to add a chip for H.264 encoding, the decision may hinge on any number of factors beyond simply being able to say it does H.264 encoding in hardware.

Long ago the first PowerBook to include a DVD drive had an external device that plugged into the PC card slot that accelerated DVD decoding. (Historical similarity.)
 
Perhaps we should not take the comments literally - as in, Apple will add a chip to solely handle "QuickTime" acceleration in addition to the integrated or separate GPU already found on their machines.

What this could mean is that all Macs will have a separate GPU and not rely on integrated graphics - or at least Intel integrated graphics. Many of us (myself included) assumed Apple would adopt Montevina and the X4500 IGP, but now we have had rumors that Apple might be moving the Mac line (sans Mac Pro) to the nVidia mobile chipset.

Now, I imagine the nVidia chipset will have an IGP option, but I expect it will be superior to the X4500 so even if Apple goes that way for the MacBook and Mac Mini, it could very well improve h.264 performance beyond what we'd get with the X4500. And should Apple just use a separate GPU with the nVidia chipset, performance should be even better.
 
Yes they are. ATI/AMD have UVD, NVIDIA have PureVideo, and Intel now have a unit in their latest chipset. This isn't new stuff either: UVD and PureVideo have been around for a couple of years, and they have a real, noticeable benefit and reduce power consumption in mobile platforms too.

Those are both DEcoders, whereas this new chip would be mainly used to speed up encoding.

Blu-ray MacBooks & MacBook Pros with iSight HD in September FTW!

Fast HD video encoding & Blu-ray burning with your iSight or HD cam - it's all coming together! This is how Apple will "differentiate from competitors" in 2008/2009 and incur slight losses on each computer sold, due to the expensive Blu-ray drive that Apple will nicely not hike the price up for.
 
Hi all, I'm asking this from a position of total ignorance as to how these things work, but would using a dedicated chip improve battery life over how things are currently done? If so, could this be a significant thing for notebooks?
 
I stand corrected on that fact. But that doesn't really change the issue I was responding to: The majority of Macs out there don't have hidden hardware that can just be 'turned on.'

I was incorrect in the way I responded to that, but the central point remains the same.

Let's see.

Mac Pro, MacBook Pro, and iMac - all cards include h.264 decoding, and at least one says it does encoding. Looks like only the MacBook, Air, and mini don't have that "hidden" hardware. So no, you were wrong on this one. As I said in my first post, most Mac models already have hardware that does this, so I don't get what this rumor is supposed to be about.

Can anyone clarify for sure that OSX doesn't take advantage of this hardware acceleration?

Hi all, I'm asking this from a position of total ignorance as to how these things work, but would using a dedicated chip improve battery life over how things are currently done? If so, could this be a significant thing for notebooks?

Yes. And many (non-apple) notebooks are doing this already.
 
[snip]

And of course, all discrete GPUs in the current Apple lineup already accelerate h.264 decode with Blu-ray support, although better drivers are probably needed. Intel's GMA X4500 also provides h.264 decoding acceleration so decoding h.264 is hardly a feature that needs a separate chip.

Apple isn't going to use any more Intel onboard graphics in upcoming Macbooks/Mac Mini/Apple TV. That's one of the reasons they purchased their own chip company.

HDMI ports for every Mac.;)
 

We're talking real-time, uncompressed HD, not "good enough, compressed HD content that isn't real-time." We're also hinting at real-time raytracing that needs a chipset that OpenCL can leverage without taxing the GPU or any free cores.
 
We're talking real-time, uncompressed HD, not "good enough, compressed HD content that isn't real-time." We're also hinting at real-time raytracing that needs a chipset that OpenCL can leverage without taxing the GPU or any free cores.

Actually, I assume you mean HD compressed at a higher data rate.

Uncompressed HD is only used in things like film and HDTV production. It would make no sense to include dedicated hardware for that standard on a consumer box, or even on the Mac Pro. Hardware that can do real time uncompressed HD needs to be beefy and it's still expensive, at least too expensive to make a standard inclusion.
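To put a rough number on why uncompressed HD needs beefy hardware, here's a back-of-envelope sketch (assuming 8-bit 4:2:2 1080p at 30 fps; the Blu-ray figure is the spec's roughly 40 Mbit/s ceiling for video):

```python
# Back-of-envelope data rate for uncompressed 1080p.
# Assumes 8-bit 4:2:2 chroma sampling (2 bytes/pixel) at 30 fps.
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 2   # 8-bit 4:2:2
FPS = 30

bytes_per_frame = WIDTH * HEIGHT * BYTES_PER_PIXEL      # 4,147,200 bytes
mbit_per_sec = bytes_per_frame * FPS * 8 / 1_000_000    # ~995 Mbit/s

print(f"{mbit_per_sec:.0f} Mbit/s uncompressed vs ~40 Mbit/s max for Blu-ray video")
```

That's close to a full gigabit per second, sustained, before you even touch 10-bit or higher frame rates - versus a few tens of megabits for compressed Blu-ray streams.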

Most graphics cards can decode HD content at rates comparable to things like Blu-ray and broadcast HDTV. And for most users, that's plenty good enough... or at least it would be if Apple supported the hardware instead of letting it sit there unused.
 
We're talking real-time, uncompressed HD not "good enough, compressed HD content that isn't real-time."

And real-time, uncompressed HD is exactly what the video cards deliver.

anandtech.com said:
the GeForce 8600, it is currently the best option for anyone looking to watch Blu-ray or HD-DVD on their PCs. The full H.264 offload onto the GPU makes HD movie playback not only painless but also possible on lower speed systems.


We're also hinting at real-time raytracing that needs a chipset that OpenCL can leverage without taxing the GPU or any free cores.

Uh, no, that wasn't at all what was hinted at in any way.
 
Blu-ray

I was going to say, as I was reading this rumor: "What about Blu-Ray"? Then the rumor at the end mentioned it. BUT I THOUGHT OF IT FIRST! YAR!

Seriously, where is Apple with Blu-ray SuperDrives? It is borderline unacceptable that Apple does not sell these yet, especially since Apple is on the Blu-ray Disc Association's board!
 
What's the point in adding another chip when you could put a better graphics card in there instead? Plus isn't QuickTime being re-written to improve performance with Snow Leopard?

Sounds like a rumour or two gone through a game of Chinese whispers.
 
Depending on what this chip really does, it may allow Apple to support Blu-Ray movie playback without ruining the rest of Mac OS X.

One of the big problems with Windows Vista is that it supports software playback of HD content. Thanks to the requirements imposed by the movie studios, Microsoft has had to specify and develop (whether it works or not) massive amounts of DRM infrastructure, including CODECs that keep the video encrypted during the decoding process, the ability to selectively detect and disable video cards that don't observe the rules, tilt-bits to detect bus-snoopers, etc.

I suspect Apple doesn't want to play this game. If I were them, I'd refuse to compromise a good multimedia architecture simply because some movie studios demand it.

One possible way around all this might be a dedicated HD-decode chip, as a part of the video card. Mac OS can then feed the raw, encrypted, content from the disc straight to the chip, where the video goes straight to the video output, without ever passing through main memory. Sort of like the overlay model used by video-playback cards, back before CPUs were powerful enough to play real-time video.

This way, the studios get their insane encryption requirements met and Mac OS doesn't get crippled by the attempt to enforce it.

And, of course, when you're not playing a BD movie, you've got a nice powerful auxiliary GPU that can be used for whatever else the system needs.
 
That'd be hot if the Mac Pros and high end iMac started including Blu-Ray drives.
 
Seriously, where is Apple with Blu-ray SuperDrives? It is borderline unacceptable that Apple does not sell these yet, especially since Apple is on the Blu-ray Disc Association's board!
The big problem (see my previous post) is that BD movie playback can't be done without complying with a huge set of insane DRM requirements.

Apple won't ship a BD drive until they can provide a way to play BD movies, otherwise customers will complain a lot more than they are now (about not having the drive.)

This sucks, because people (like me) want a BD-RE drive for use as a data drive, for use as a backup device, and don't care about playing movies (of any kind) on the computer. But I'm certain that I'm in a clear minority with that opinion.
 
Dedicated video hardware? Yeah, it's called a GPU. :rolleyes:

This is not going to happen.

edit:

[other stuff]

This sucks, because people (like me) want a BD-RE drive for use as a data drive, for use as a backup device, and don't care about playing movies (of any kind) on the computer. But I'm certain that I'm in a clear minority with that opinion.

dude, you can totally buy a blu-ray drive off of newegg and stick it in your mac pro and use it for data burning of blu-ray discs. not even joking.
 
It's great that we are catching up. Don't Nvidia and ATI cards already support fast rendering or playback of these files?

I would hope that Apple would release some more Snow Leopard info as well. We really need better multi-CPU support too.
 
It sounds like it would've been useful 3 years ago, but not today.

AFAIK, even the lowest-end current MacBook can encode and decode h.264 at a reasonable rate without a dedicated chip. Heck, my 5-year-old 1 GHz G4 PowerBook can encode h.264 off a DVD in very high quality at something like 8 times longer than real time. So I would presume (and hope) that even without a dedicated chip, a brand-new late 2008 / early 2009 MacBook should be able to do stuff more than 8 times faster!
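A quick sanity check on that arithmetic (the 8x-faster figure for a new machine is a guess for illustration, not a benchmark):

```python
def encode_time_hours(source_hours, slowdown_vs_realtime):
    """Hours needed to encode `source_hours` of video when the encoder
    runs `slowdown_vs_realtime` times slower than real time."""
    return source_hours * slowdown_vs_realtime

# A 2-hour DVD on the 1 GHz G4, encoding at 8x real time: 16 hours.
g4_hours = encode_time_hours(2, 8)
# A hypothetical machine 8x faster than the G4: back to real time.
new_hours = encode_time_hours(2, 8 / 8)
print(g4_hours, new_hours)  # 16 2.0
```

So an 8x speedup over the old G4 would bring a 2-hour encode down from 16 hours to real time, with no extra silicon needed.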

That's my thinking as well. Might have been worthwhile a ... well, a while ago, but it's really needed in older machines, not newer ones. Increasingly powerful processors have pretty much rendered it obsolete. It's kind of like running a Radeon 9600 Pro in an Intel Core 2 3 GHz+ system... it can probably run games just as fast with software rendering.
 