This could refer to capabilities built into the graphics system chipset. Don't some of the Intel gfx boards claim this stuff already?
 
All Intel Macs (with Nvidia and ATI graphics cards) should already have hardware H.264 decoding. IIRC the ATI cards can even do hardware encoding (well, ATI was working on this a while back, but I haven't heard much since).

So it should just be a matter of enabling those features.

That was my thought. Why put a dedicated chip in when you already have a $50 chip in there that does what you want?

I don't think the X1600 had the encoder built in, but the newer models do. It's just a matter of turning it on. However, if it is built into the graphics card, then the faster the card, the faster the encoding.
 
All Intel Macs (with Nvidia and ATI graphics cards) should already have hardware H.264 decoding. IIRC the ATI cards can even do hardware encoding (well, ATI was working on this a while back, but I haven't heard much since).

So it should just be a matter of enabling those features.
Please enlighten me further. :)
Does this mean that my iMac (latest model) will (can) have this feature?
If so, how will it work? :confused:
 
First of all, yeah right, second of all, why? While H.264 produces smaller files, they're still massive compared to YouTube-ish vids, and even on the latest Core2Duo systems, they tax the processor. I know, I know, "but the quality is better!" you say. Yep, but YouTube, and the like, has shown us that most people don't give a toss about the quality, just the bare minimum to get by.

No, if anything, this is for converting DVDs for "personal" use, or it's not going to happen at all. I'd love to have process-specific dedicated hardware for lots of things (think what the Cell processor was designed for), and I hope that happens like it did for graphics cards.

offer H.264 encoding to allow users to quickly encode high quality video clips for upload to the internet.
 
So a $50 chip will be faster than the Core 2 Duo monsters?

Creative's $40 chip does 4.8 billion floating-point operations per second. That's not quite the same as a Core 2 Duo, but then it only takes 1 watt, which is just enough for 720p H.264 decoding (Google for DMS-02). It should be able to do 1080p if you don't mind using 4 watts instead.
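A quick back-of-the-envelope check of those figures (a minimal sketch: the 1 W / 4 W numbers come from the post above, and the assumption that decode cost scales roughly with pixel count is mine, not a spec):

```python
# Does a 4x power budget cover the jump from 720p to 1080p, assuming decode
# cost scales roughly with pixel count? (Assumption, not a chip spec.)
pixels_720p = 1280 * 720       # 921,600 pixels per frame
pixels_1080p = 1920 * 1080     # 2,073,600 pixels per frame
pixel_ratio = pixels_1080p / pixels_720p

watts_720p = 1.0               # figure quoted above for the DMS-02-class chip
watts_1080p_budget = 4.0       # figure quoted above for 1080p

print(f"1080p has {pixel_ratio:.2f}x the pixels of 720p")  # 2.25x
print(f"headroom: {watts_1080p_budget / watts_720p / pixel_ratio:.2f}x")  # 1.78x
```

So even on this crude pixel-count model, 4 W leaves comfortable headroom over the 2.25x pixel increase.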
 
Please enlighten me further. :)
Does this mean that my iMac (latest model) will (can) have this feature?
If so, how will it work? :confused:

Do a search for AVIVO. IIRC the X1600 doesn't support it (no hardware built in), but its successor will have it, as does the X1800 (or X1850) that the Mac Pro has.
 
This might just be me being stupid, but $50 (or €50, as it's going to be converted into :mad: ) is a bit of a price to pay for faster ripping of DVDs... or is there more to this that I'm missing???
 
First of all, yeah right, second of all, why? While H.264 produces smaller files, they're still massive compared to YouTube-ish vids, and even on the latest Core2Duo systems, they tax the processor. I know, I know, "but the quality is better!" you say. Yep, but YouTube, and the like, has shown us that most people don't give a toss about the quality, just the bare minimum to get by.

No, if anything, this is for converting DVDs for "personal" use, or it's not going to happen at all. I'd love to have process-specific dedicated hardware for lots of things (think what the Cell processor was designed for), and I hope that happens like it did for graphics cards.

For watching on the internet via the likes of YouTube, no one bothers with quality because they're more interested in a quick laugh or something. But put it on a television, or sit back in front of your iMac or Apple Cinema Display to watch a movie that you own, and I think most people would begin to complain about pixels that are suddenly the size of your pinky finger's nail.

For sure, this is a very important step, especially if it is to be an across-the-board operation incorporating their entire line of computers. This is one reason Apple is an amazing company: one that, while large, retains creativity.

That being said, this still has to be proven. If an analyst is talking about a dedicated chip that they will put in, it is definitely not the current ATI chip, as not all iMacs (the education model) nor the mini have the ATI chip as a base. Could it be a future iteration of the graphics engine? Maybe, but I reckon it might be a separate processor. However it appears, bring it on -- about time.
 
Do a search for AVIVO. IIRC the X1600 doesn't support it (no hardware built in), but its successor will have it, as does the X1800 (or X1850) that the Mac Pro has.
:confused:
Ready to get the full impact of video and display perfection?
Enjoy the quality and performance of ATI Avivo in any of the ATI Radeon® X1K products, ATI Mobility™ Radeon X1K products, ATI All-in-Wonder® X1K products, ATI TV Wonder™ Elite, ATI HDTV Wonder™, and partner products based on ATI Theater™ 550 PRO and the all new ATI Theater™ 650 PRO technology.
:confused:

Doesn't "ATI Mobility™ Radeon X1K products" include the X1600? :confused:
 
:confused: :confused:

Doesn't "ATI Mobility™ Radeon X1K products" include the X1600? :confused:

Well, I've been known to be wrong, and it seems the X1600 does have Avivo. It must have been the last generation that didn't have it in the mainstream chips.

EDIT: Well it could be that they lack the power to encode, whereas the X1800 can encode.
 
$100

A $50 chip means a $100 difference in retail price. Does anyone really think they're going to raise the price of the Mac mini by $100 when the mini can already play back H.264 when not heavily loaded? That's a lot of money for something that is only useful to people like me who rip DVDs.
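The jump from a $50 part to a $100 retail difference is the usual component-cost-to-shelf-price markup; a trivial sketch of that arithmetic (the 2x multiplier is a general industry rule of thumb, not an Apple-confirmed figure):

```python
# Back-of-envelope markup math behind "a $50 chip means a $100 difference
# in retail price". The 2x BOM-to-retail multiplier is a common rule of
# thumb, not anything Apple has confirmed.
chip_cost_usd = 50
bom_to_retail_multiplier = 2
retail_delta_usd = chip_cost_usd * bom_to_retail_multiplier
print(retail_delta_usd)  # 100
```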
 
If it is a $50 chip, it had better do encoding. Even high-definition H.264 decoders are in the $10 price range. These kinds of chips, from Sigma, etc., are going into next-gen DVD players...

By the way, the latest generation of graphics chips can do a significant part of the decoding, and they can even help with encoding to some extent.
 
Something else that this would allow is fast and presumably legal ripping of DVDs to iPods. That is, every Mac has a license to decode DVDs, and the act of format shifting DVDs is legal; only removing the encryption isn't. Since you can legally decrypt a movie on any Mac, piping the data into a hardware encoder for realtime H.264 rather than a display should be a legal task. Not sure if they'd be allowed to do it faster than realtime, but realtime at least.

Then again, a chip like this is not worth it in the Mac Pro. Already I can hit realtime on 3 simultaneous encodes using a pre-release build of MediaFork (before they added a couple bad bugs which cripple simultaneous encodes). I don't need a hardware chip to hit realtime on a single encode, even on my MacBook.

IJ Reily said:
Maybe I'm misunderstanding, but I think Cringley is suggesting something different, which is h.264 encoding and decoding in hardware on all Macs. Has anyone done that yet? Would it have the benefits he describes?

I was addressing that by saying that H.264 /hardware/ encoding/decoding at this point is already 'too expensive' to be worth doing. H.264 decoding at 1080p can be done /now/ in software across Apple's entire current line. That leaves encoding as the main benefit, and H.264 can already encode in realtime on some of Apple's product line.

For Apple to be willing to drop $50 on an H.264 chip, they either are expecting to shove HD DVD or Blu-ray into the entire line (not likely; I am not sure people want a $1,000 mini with 512MB of RAM and no GPU)... or they are expecting to do something HUGE with encoding (media center competition?). By the time the chip starts doing good things for Apple's sales, software will be able to nearly overtake the hardware, and they'll take it back out again.

Cringely's musings, while entertaining, have really made me wonder why I looked up to the guy when I was younger. His point that this chip is superior to a software solution banks on this 'cheap' chip doing things better than a CPU, but misses the point of the cheap, specialized chip in the first place. A DVD player needs to be cheap, and you can't go around putting a $200 CPU into it just to play DVDs... you build a custom DSP for DVDs, along with a $10 CPU, and save more than 50% on your costs. A computer is already expensive, so using a bunch of specialized chips doesn't make much sense when the $200 CPU /can/ do it all.
 
This might just be me being stupid but $50 (or €50 as it's going to be converted into :mad: ) is a bit of a price to pay for faster ripping of DVD's... or is there more to this that i'm missing???

Well, all Apple laptops now have built-in cameras. I have read rumors that new Apple displays will also have built-in cameras (instead of attaching an iSight camera). The new iPhone has a built-in camera.

Ubiquitous H.264 hardware encoding would enable high-quality video chatting and conferencing just about anywhere. And you would get a network effect, where the more people who have this on their devices, the more valuable the device is for each person. (Hence the reason for making it standard.) It would be especially cool for the iPhone (if Apple also provides a camera lens on the same side as the screen) -- just pull out your iPhone while at Starbucks (using WiFi ... I don't know about AT&T's EDGE) and have a video chat with your girlfriend.

I don't know if anyone remembers the video communication device in the TV series "Earth: Final Conflict", but this could come close to making it real.
 
If it is a $50 chip, it had better do encoding. Even high-definition H.264 decoders are in the $10 price range. These kinds of chips, from Sigma, etc., are going into next-gen DVD players...


Partially correct. I suspect Robert X. is talking about the Sigma chip that is heading into the 2nd-gen Blu-ray players (and possibly HD DVD, if it survives long enough); they aren't heading for vanilla DVD players. If you want Blu-ray, either wait for the 2nd-gen players or buy a Sony PS3, because the Broadcom chip that ships with the Samsung Blu-ray player and the Toshiba HD DVD players sucks. Just to output 1080i properly, Samsung had to add a co-processing chip. And again, this is all the fault of Toshiba for shipping HD DVD early in an attempt to foil the rest of the consumer electronics industry's support for Blu-ray before either platform was ready for a proper launch.

Then again, perhaps Robert X. thinks the chip in question is a Cell processor. The Cell does a fabulous job decoding video on the Sony PS3 and it would be in Sony's best interest to have a second (or third, or more) company out there ordering chips in order to lower the manufacturing costs. The only problem is the yields haven't been terrific, from past reports. Of course, in all fairness, most Blu-Ray titles are still encoded in either VC-1 or MPEG2, and not H.264 MPEG4 AVC (X-Men 3 comes to mind), so even the current output of the Cell is debatable amongst the fanbois and the h8ers.
 
So if I am reading this correctly, the current MacBook has the H.264 Decoder in it...we just need to unlock it? Possibly with a driver like 802.11n?
 
This guy couldn't read the ingredients off a box of cereal and get it correct. A chip cost cited at $50 is meaningless; Apple has huge buying power and can negotiate prices way, way down, below even wholesale prices (buying RAM from Apple is one of the biggest rip-offs ever).

Furthermore Apple isn't going to simply "eat" the cost, it will be figured in somehow with reduced costs of some other component or amortized over time as with every other product roll out.

This may or may not happen, but with this guy place your money on it NOT happening (the way he described it, at least).
 
First of all, yeah right, second of all, why? While H.264 produces smaller files, they're still massive compared to YouTube-ish vids, and even on the latest Core2Duo systems, they tax the processor. I know, I know, "but the quality is better!" you say. Yep, but YouTube, and the like, has shown us that most people don't give a toss about the quality, just the bare minimum to get by.

Which is exactly why I don't "Browse YouTube". It's bad enough that a lot of it is home brewed garbage, but even with stuff that is quality content, the low-quality presentation of that content just turns me off. And, I'd certainly never post anything there if it means that my video turns to mush.

Anything that can be moved from the CPU to dedicated hardware is fine by me (as long as the cost to do so is within reason).
 
So if I am reading this correctly, the current MacBook has the H.264 Decoder in it...we just need to unlock it? Possibly with a driver like 802.11n?

Kind of. The 802.11n enabler was more of a firmware patch and a driver; it turned some stuff on at the chip level too. But a similar thing could be done for the graphics chips.
 