
Apple Adding H.264 Hardware Decoder Chip to Macs?




mrgreen4242
Mar 9, 2007, 11:49 AM
Not sure where the best place to have this discussion is, but Cringely is at it again, this time speculating (well, hinting that he knows) that Apple will be putting H.264 encoding/decoding hardware in all Macs as soon as this year.

Here's the article:
http://www.pbs.org/cringely/pulpit/2007/pulpit_20070308_001806.html



psychofreak
Mar 9, 2007, 11:56 AM
I don't really see much point in this for the Mac mini, for example...

Diode
Mar 9, 2007, 12:00 PM
Could be a help for Apple TV users needing to convert a boatload of media.

TBi
Mar 9, 2007, 12:06 PM
The new ATi chips have H.264 decode/encode built in (AVIVO). All this means is that all the Macs will be getting ATi chips (or the nVidia equivalent). There won't be a dedicated chip other than that.

I don't get what he thinks the big hoopla is about. Anyone with half a brain could have predicted this. (Except maybe in the mini or MacBook, which don't have ATi chips but may have them in the future.)

mrgreen4242
Mar 9, 2007, 12:06 PM
I don't really see much point in this for the Mac mini, for example...

Um. That's exactly the point... even the slower/less expensive machines would have the same baseline for video performance as the most expensive machines. The benefits outside of video production are in faster-than-realtime transferring of video to iPods/iTV/web sharing.

Apple could sell HD movies and iTunes could transcode that video to an iPod-sized file, etc. Assuming it did MPEG2 as well as H.264 (not unlikely if it happened), it would benefit iDVD, which is something that people are going to be using more and more for home movies, photo slideshows, amateur productions, etc.

Anyway, if they do this, I expect to see it more as a Core Video-like implementation where supported GPUs take over encoding/decoding work, which would give a baseline level of support to all Macs but would give people incentive to upgrade to more expensive machines/GPUs.

TBi
Mar 9, 2007, 12:12 PM
Um. That's exactly the point... even the slower/less expensive machines would have the same baseline for video performance as the most expensive machines. The benefits outside of video production are in faster-than-realtime transferring of video to iPods/iTV/web sharing.


If it is a dedicated chip then they will all be as fast, but if, as I'm thinking, it will just be an ATi graphics chip doing the encoding, then the faster Macs (with faster graphics) will still have an edge speed-wise.

However, if it's true and there will be a dedicated chip, then that would add credence to the rumor that they will be announcing special hardware accelerators for the Mac Pro.

IJ Reilly
Mar 9, 2007, 12:23 PM
I don't get what he thinks the big hoopla is about. Anyone with half a brain could have predicted this. (Except maybe in the mini or MacBook, which don't have ATi chips but may have them in the future.)

I don't see any big hoopla, but riddle me this: if this is such an obvious development, then why hasn't anyone done it yet?

nateDEEZY
Mar 9, 2007, 12:33 PM
This would be a pretty significant update for owners who currently have 24" iMacs and use them as their media center.

An update like this would really make me consider selling my 24" iMac.

Power supply temperature of 186 degrees Fahrenheit while converting a 720p 59.7fps recording to play on a DVD player, sustaining those high temps for about an hour to do the encoding and then burn it, all the while using 98% of the CPU power.

aristobrat
Mar 9, 2007, 12:33 PM
Yay.

Then maybe the "vencoder" process won't use 40% of my CPU when I'm in a video iChat. :)

TBi
Mar 9, 2007, 12:42 PM
I don't see any big hoopla, but riddle me this: if this is such an obvious development, then why hasn't anyone done it yet?

I don't think the X1600 comes with Avivo, but the newer chips from ATi do (the X1650 or something). More than likely Apple is going to upgrade all its systems with these new Avivo chips.

If you have a PC with an AVIVO enabled ATi card you can download a special decoder from ATi.

(Looking at it, I'm not sure if it can do hardware encoding of H.264 at the moment, but with the programmable pipeline I'd say it is possible to add this in the future.)

Krevnik
Mar 9, 2007, 01:06 PM
I don't see any big hoopla, but riddle me this: if this is such an obvious development, then why hasn't anyone done it yet?

Usually cost is a reason. Including a hardware DSP chip tends not to be the cheapest thing in the world (a decent one is easily around the same cost as a GPU), and writing special drivers for a feature currently in a small line of GPUs isn't a great use of money either.

Take the life of hardware DVD decoders in Macs... it lasted a whopping three products (the Lombard, the Pismo, and the B&W G3) until the G4 hit, which closed the gap between software and hardware decoding (at least enough to not justify a $25-50 chipset being added or sold as an upgrade).

Already, H.264 decoding on recent Macs can do 1080p or get darn close to it. The main benefit of an H.264 chip is transcoding or encoding, which, without a user scenario that makes sense (i.e., unless users of an Apple app or peripheral can benefit greatly from it), isn't worth the trouble of implementing.

I see this more being useful in Final Cut for HD authoring, or in iTunes for HD-DVD/Blu-Ray 'Managed Copy'.

IJ Reilly
Mar 9, 2007, 01:12 PM
I don't think the X1600 comes with Avivo, but the newer chips from ATi do (the X1650 or something). More than likely Apple is going to upgrade all its systems with these new Avivo chips.

If you have a PC with an AVIVO enabled ATi card you can download a special decoder from ATi.

(Looking at it, I'm not sure if it can do hardware encoding of H.264 at the moment, but with the programmable pipeline I'd say it is possible to add this in the future.)

Usually cost is a reason. Including a hardware DSP chip tends not to be the cheapest thing in the world (a decent one is easily around the same cost as a GPU), and writing special drivers for a feature currently in a small line of GPUs isn't a great use of money either.

Take the life of hardware DVD decoders in Macs... it lasted a whopping three products (the Lombard, the Pismo, and the B&W G3) until the G4 hit, which closed the gap between software and hardware decoding (at least enough to not justify a $25-50 chipset being added or sold as an upgrade).

Already, H.264 decoding on recent Macs can do 1080p or get darn close to it. The main benefit of an H.264 chip is transcoding or encoding, which, without a user scenario that makes sense (i.e., unless users of an Apple app or peripheral can benefit greatly from it), isn't worth the trouble of implementing.

I see this more being useful in Final Cut for HD authoring, or in iTunes for HD-DVD/Blu-Ray 'Managed Copy'.

Maybe I'm misunderstanding, but I think Cringely is suggesting something different, which is H.264 encoding and decoding in hardware on all Macs. Has anyone done that yet? Would it have the benefits he describes?

TBi
Mar 9, 2007, 01:48 PM
Maybe I'm misunderstanding, but I think Cringely is suggesting something different, which is H.264 encoding and decoding in hardware on all Macs. Has anyone done that yet? Would it have the benefits he describes?

Thing is that if you had full AVIVO then that would be hardware encoding/decoding, just built into the GPU and not separate.

IJ Reilly
Mar 9, 2007, 03:33 PM
Thing is that if you had full AVIVO then that would be hardware encoding/decoding, just built into the GPU and not separate.

I get that, but if Cringely is right, Apple is planning on incorporating full H.264 support into their entire product line. This implies something more than upgrading the GPU, since neither the mini nor the MacBook has a discrete GPU. FWIW, Apple has some history of building Macs with hardware encoding/decoding -- the Quadra 660av and 840av models, almost 15 years ago.

mrgreen4242
Mar 9, 2007, 03:53 PM
I get that, but if Cringely is right, Apple is planning on incorporating full H.264 support into their entire product line. This implies something more than upgrading the GPU, since neither the mini nor the MacBook has a discrete GPU. FWIW, Apple has some history of building Macs with hardware encoding/decoding -- the Quadra 660av and 840av models, almost 15 years ago.

The Lombard and Wallstreet PowerBooks and the B&W Power Mac had optional hardware MPEG2 decoding as well. Not encoding, though.

Something else this would allow is fast and presumably legal ripping of DVDs to iPods. That is, every Mac has a license to decode DVDs, and the act of format-shifting DVDs is legal; only removing the encryption isn't. Since you can legally decrypt a movie on any Mac, piping the data into a hardware encoder for realtime H.264 rather than to a display should be a legal task. Not sure if they'd be allowed to do it faster than realtime, but realtime at least.

So, this could be a way for Apple to incorporate DVD ripping into iTunes...

MacRumors
Mar 9, 2007, 06:03 PM
http://www.macrumors.com/images/macrumorsthreadlogo.gif (http://www.macrumors.com)

Robert Cringely claims (http://www.pbs.org/cringely/pulpit/2007/pulpit_20070308_001806.html) that Apple is planning on incorporating dedicated H.264 decoding chips into future Mac hardware.

Now comes the rumor I have heard, that I believe to be a fact, that has simply yet to be confirmed. I have heard that Apple plans to add hardware video decoding to ALL of its new computers beginning fairly soon, certainly this year.

The article claims that incorporating a dedicated H.264 decoding chip will allow Apple to ensure the same base performance on every machine it sells. The $50 chip is said to also offer H.264 encoding to allow users to quickly encode high quality video clips for upload to the internet.

~Shard~
Mar 9, 2007, 06:05 PM
Nice - yet another thing to look forward to when I buy a new Mac portable in a few months - assuming this is true... ;) :cool:

crap freakboy
Mar 9, 2007, 06:09 PM
Well, if that is true then it can only be a good thing.
For that small sum it would be madness not to include that magnitude of encoding speed across the board. Handbrake users rejoice. I am drunk.:)

Fearless Leader
Mar 9, 2007, 06:10 PM
So a $50 chip will be faster than the Core 2 Duo monsters?

Nermal
Mar 9, 2007, 06:12 PM
It's not so much that it'll be faster, but that it won't slow down the system. If it's sitting on a dedicated chip, then you still have 100% of your CPU available for other stuff.

FoxyKaye
Mar 9, 2007, 06:12 PM
It would certainly be a welcome addition on my next Mac - because encoding H.264 on my current iMac is *woof, woof* dog slow. Also bodes well for the HD market and the general direction FCP and iLife apps are taking.

psychofreak
Mar 9, 2007, 06:12 PM
Well, if that is true then it can only be a good thing.
For that small sum it would be madness not to include that magnitude of encoding speed across the board. Handbrake users rejoice. I am drunk. :)

It would be great for people who want to quickly rip a DVD for watching on :apple: tv

Fearless Leader
Mar 9, 2007, 06:14 PM
It's not so much that it'll be faster, but that it won't slow down the system. If it's sitting on a dedicated chip, then you still have 100% of your CPU available for other stuff.

Don't know why I didn't think of that. I wonder if the Apple TV is using this?

hvfsl
Mar 9, 2007, 06:14 PM
All Intel Macs (with Nvidia and ATI graphics cards) should already have hardware H.264 decoding. IIRC the ATI cards can even do hardware encoding (well ATI were working on this a while back, but haven't heard much since).

So it should just be a matter of enabling those features.

LeviG
Mar 9, 2007, 06:18 PM
It shouldn't be hard with newer hardware to use the GPU's processing power to supplement CPU power. Folding@home has a version of its software that can work on nVidia/ATi gfx cards on Windows systems, so I don't see why a similar approach couldn't be taken by Apple for H.264 encoding, etc.

Kingsly
Mar 9, 2007, 06:19 PM
Hardware decoding and encoding? Yess! :)

Nicky G
Mar 9, 2007, 06:22 PM
This could refer to capabilities built in to the gfx system chipset. Don't some of the Intel gfx boards claim this stuff already?

TBi
Mar 9, 2007, 06:22 PM
All Intel Macs (with Nvidia and ATI graphics cards) should already have hardware H.264 decoding. IIRC the ATI cards can even do hardware encoding (well ATI were working on this a while back, but haven't heard much since).

So it should just be a matter of enabling those features.

That was my thought. Why put a dedicated chip in when you already have a $50 chip in there that does what you want?

I don't think the X1600 had the encoder built in, but the newer models do. It's just a matter of turning it on. However, if it is built into the graphics card, then the faster the card, the faster the encoding.

IEatApples
Mar 9, 2007, 06:23 PM
All Intel Macs (with Nvidia and ATI graphics cards) should already have hardware H.264 decoding. IIRC the ATI cards can even do hardware encoding (well ATI were working on this a while back, but haven't heard much since).

So it should just be a matter of enabling those features.
Please enlighten me further. :)
Does this mean that my iMac (latest model) will (can) have this feature?
If so, how will it work? :confused:

Some_Big_Spoon
Mar 9, 2007, 06:23 PM
First of all, yeah right; second of all, why? While H.264 produces smaller files, they're still massive compared to YouTube-ish vids, and even on the latest Core 2 Duo systems they tax the processor. I know, I know, "but the quality is better!" you say. Yep, but YouTube and the like have shown us that most people don't give a toss about the quality, just the bare minimum to get by.

No, if anything, this is for converting DVDs for "personal" use, or it's not going to happen at all. I'd love to have process-specific dedicated hardware for lots of things (think what the Cell processor was designed for), and I hope that happens like it did for GFX cards.

offer H.264 encoding to allow users to quickly encode high quality video clips for upload to the internet.

gnasher729
Mar 9, 2007, 06:26 PM
So a $50 chip will be faster than the Core 2 Duo monsters?

Creative's $40 chip does 4.8 billion floating-point operations per second, which is not quite the same as a Core 2 Duo, but then it only takes 1 watt. That's just enough for 720p H.264 decoding (Google for DMS-02). It should be able to do 1080p if you don't mind using 4 watts instead.
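As a rough sanity check on those numbers, here's a back-of-envelope sketch. It assumes (hypothetically) that decode workload scales linearly with pixel rate; the throughput and power figures come from the post above, not a datasheet:

```python
# Figures quoted in the post: ~4.8 GFLOPS at ~1 W, enough for 720p decode.
chip_gflops = 4.8
chip_watts = 1.0

pixels_720p = 1280 * 720      # 921,600 pixels per frame
pixels_1080p = 1920 * 1080    # 2,073,600 pixels per frame
scale = pixels_1080p / pixels_720p  # ratio of pixel throughput needed

# If power scaled linearly with pixel rate, 1080p would need roughly:
est_watts_1080p = chip_watts * scale
print(f"1080p/720p pixel ratio: {scale:.2f}")          # 2.25
print(f"Estimated 1080p power: ~{est_watts_1080p:.1f} W")
```

A linear estimate lands around 2.25 W, so the 4 W figure in the post leaves comfortable headroom for overhead that doesn't scale with pixels.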

TBi
Mar 9, 2007, 06:27 PM
Please enlighten me further. :)
Does this mean that my iMac (latest model) will (can) have this feature?
If so, how will it work? :confused:

Do a search for AVIVO. IIRC the X1600 doesn't support it (no hardware built in), but its successor will have it, as does the X1800 (or X1850) that the Mac Pro has.

OwlsAndApples
Mar 9, 2007, 06:29 PM
Nice could-be addition to the speculated super-portable...:)

IEatApples
Mar 9, 2007, 06:31 PM
Do a search for AVIVO. IIRC the X1600 doesn't support it (no hardware built in), but its successor will have it, as does the X1800 (or X1850) that the Mac Pro has.

Thanks! :)

MacAodh
Mar 9, 2007, 06:33 PM
This might just be me being stupid, but $50 (or €50, as it's going to be converted into :mad: ) is a bit of a price to pay for faster ripping of DVDs... or is there more to this that I'm missing?

shigzeo
Mar 9, 2007, 06:35 PM
First of all, yeah right; second of all, why? While H.264 produces smaller files, they're still massive compared to YouTube-ish vids, and even on the latest Core 2 Duo systems they tax the processor. I know, I know, "but the quality is better!" you say. Yep, but YouTube and the like have shown us that most people don't give a toss about the quality, just the bare minimum to get by.

No, if anything, this is for converting DVDs for "personal" use, or it's not going to happen at all. I'd love to have process-specific dedicated hardware for lots of things (think what the Cell processor was designed for), and I hope that happens like it did for GFX cards.

For watching on the internet via the likes of YouTube, no one bothers with quality because they're more interested in a quick laugh or something, but put it on a television, or sit back in front of your iMac or Apple Cinema Display to watch a movie that you own, and I think most people would begin to complain about pixels that are suddenly the size of your pinky fingernail.

For sure, this is a very important step, especially if it is to be an across-the-board operation incorporating their entire line of computers. This is a reason Apple is an amazing company that, while large, retains creativity.

That being said, this still has to be proven. If an analyst is talking about a dedicated chip they will put in, it is definitely not the current ATi chip, as neither all iMacs (the education model) nor the mini have ATi graphics. Could it be a future iteration of the graphics engine? Maybe, but I reckon it might be a separate processor. Any way it appears, bring it on -- about time.

IEatApples
Mar 9, 2007, 06:35 PM
Do a search for AVIVO. IIRC the X1600 doesn't support it (no hardware built in), but its successor will have it, as does the X1800 (or X1850) that the Mac Pro has.
:confused: Ready to get the full impact of video and display perfection?
Enjoy the quality and performance of ATI Avivo in any of the ATI Radeon® X1K products, ATI Mobility™ Radeon X1K products, ATI All-in-Wonder® X1K products, ATI TV Wonder™ Elite, ATI HDTV Wonder™, and partner products based on ATI Theater™ 550 PRO and the all new ATI Theater™ 650 PRO technology. :confused:

Doesn't "ATI Mobility™ Radeon X1K products" include the X1600? :confused:

nazmac21
Mar 9, 2007, 06:38 PM

Doesn't "ATI Mobility™ Radeon X1K products" include the X1600? :confused:

Yes, they do, and I am waiting for an ATI Radeon X2800XT, Blu-Ray equipped Mac Pro.

TBi
Mar 9, 2007, 06:40 PM

Doesn't "ATI Mobility™ Radeon X1K products" include the X1600? :confused:

Well, I've been known to be wrong, and it seems the X1600 does have Avivo. It must have been the last generation that didn't have it in the mainstream chips.

EDIT: It could be that they lack the power to encode, whereas the X1800 can encode.

termite
Mar 9, 2007, 06:47 PM
A $50 chip means a $100 difference in retail price. Does anyone really think they're going to raise the price of the Mac mini by $100 when the mini can already play back H.264 when not heavily loaded? That's a lot of money for something that is only useful to people like me who rip DVDs.

macshark
Mar 9, 2007, 06:47 PM
If it is a $50 chip, it had better do encoding. Even high-definition H.264 decoders are in the $10 price range. These kinds of chips from Sigma, etc. are going into next-gen DVD players...

By the way, the latest generation of graphics chips can do a significant part of the decoding, and they can even help with encoding to some extent.

Krevnik
Mar 9, 2007, 06:48 PM
Something else this would allow is fast and presumably legal ripping of DVDs to iPods. That is, every Mac has a license to decode DVDs, and the act of format-shifting DVDs is legal; only removing the encryption isn't. Since you can legally decrypt a movie on any Mac, piping the data into a hardware encoder for realtime H.264 rather than to a display should be a legal task. Not sure if they'd be allowed to do it faster than realtime, but realtime at least.

Then again, a chip like this is not worth it in the Mac Pro. Already I can hit realtime on three simultaneous encodes using a pre-release build of MediaFork (before they added a couple of bad bugs which cripple simultaneous encodes). I don't need a hardware chip to hit realtime on a single encode, even on my MacBook.

Maybe I'm misunderstanding, but I think Cringley is suggesting something different, which is h.264 encoding and decoding in hardware on all Macs. Has anyone done that yet? Would it have the benefits he describes?

I was addressing that by saying that H.264 hardware encoding/decoding at this point is already 'too expensive' to be worth doing. H.264 decoding at 1080p can be done now in software across Apple's entire current line. That leaves encoding as the main benefit, and H.264 can already encode at realtime on some of Apple's product line.

For Apple to be willing to drop $50 on an H.264 chip, either they are expecting to shove HD-DVD or Blu-Ray into the entire line (not likely; I am not sure people want a $1,000 mini with 512MB of RAM and no GPU)... or they are expecting to do something HUGE with encoding (media center competition?). By the time the chip starts doing good things for Apple's sales, software will be able to nearly overtake the hardware, and they'll take it back out again.

Cringely's musings, while entertaining, have really made me wonder why I looked up to the guy when I was younger. His point that this chip is superior to a software solution banks on this 'cheap' chip doing things better than a CPU, but misses the point of the cheap, specialized chip in the first place. A DVD player needs to be cheap, and you can't go around putting a $200 CPU into it just to play DVDs... you build a custom DSP for DVDs, along with a $10 CPU, and save more than 50% on your costs. A computer is already expensive, so using a bunch of specialized chips doesn't make much sense when the $200 CPU can do it all.

Random Ping
Mar 9, 2007, 07:00 PM
This might just be me being stupid, but $50 (or €50, as it's going to be converted into :mad: ) is a bit of a price to pay for faster ripping of DVDs... or is there more to this that I'm missing?

Well, all Apple laptops now have built-in cameras. I have read rumors that new Apple displays will also have built-in cameras (instead of attaching an iSight camera). The new iPhone has a built-in camera.

Ubiquitous H.264 hardware encoding would enable high-quality video chatting and conferencing just about anywhere. And you would get a network effect, where the more people who have this on their devices, the more valuable the device is for each person. (Hence the reason for making it standard.) It would be especially cool for the iPhone (if Apple also provides a camera lens on the same side as the screen) -- just pull out your iPhone while at Starbucks (using WiFi... I don't know about AT&T's EDGE) and have a video chat with your girlfriend.

I don't know if anyone remembers the video communication device in the TV series "Earth: Final Conflict", but this could come close to making it real.
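The network effect described above can be put in rough numbers with a Metcalfe-style sketch (an illustration, not anything from the article): the value of a video-chat feature grows with the number of possible conversation pairs, not the number of users.

```python
def possible_pairs(n: int) -> int:
    """Number of distinct two-person video chats among n users: n*(n-1)/2."""
    return n * (n - 1) // 2

# Doubling the user base roughly quadruples the number of possible chats,
# which is the argument for making the hardware standard on every Mac.
for n in (10, 100, 1000, 1_000_000):
    print(f"{n:>9,} users -> {possible_pairs(n):>15,} possible pairs")
```

With 100 users there are 4,950 possible pairs; with 1,000 users, 499,500 -- a 10x user base yields roughly 100x the connections.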

Lynxpro
Mar 9, 2007, 07:00 PM
If it is a $50 chip, it had better do encoding. Even high-definition H.264 decoders are in the $10 price range. These kinds of chips from Sigma, etc. are going into next-gen DVD players...


Partially correct. I suspect Robert X. is talking about the Sigma chip that is heading into the 2nd-gen Blu-Ray players (and possibly HD DVD, if it survives long enough); they aren't heading for vanilla DVD players. If you want Blu-Ray, either wait for the 2nd-gen players or buy a Sony PS3, because the Broadcom chip that ships with the Samsung Blu-Ray player and the Toshiba HD DVD players sucks. Just to output 1080i properly, Samsung had to add a co-processing chip. And again, this is all the fault of Toshiba for shipping HD DVD early, in an attempt to foil the rest of the consumer electronics industry and its support for Blu-Ray, before either platform was ready for a proper launch.

Then again, perhaps Robert X. thinks the chip in question is a Cell processor. The Cell does a fabulous job decoding video on the Sony PS3, and it would be in Sony's best interest to have a second (or third, or more) company out there ordering chips in order to lower the manufacturing costs. The only problem is that the yields haven't been terrific, from past reports. Of course, in all fairness, most Blu-Ray titles are still encoded in either VC-1 or MPEG2, and not H.264 MPEG-4 AVC (X-Men 3 comes to mind), so even the current output of the Cell is debatable amongst the fanbois and the h8ers.

BLACK MAC
Mar 9, 2007, 07:10 PM
So if I am reading this correctly, the current MacBook has the H.264 Decoder in it...we just need to unlock it? Possibly with a driver like 802.11n?

Multimedia
Mar 9, 2007, 07:13 PM
I wrote this this morning before it got posted as news. Then I was gone when they finally did it. :) :

The Great Apple Video Encoder Attack of 2007: Cupertino plans to add H.264 hardware support to its entire line (http://www.pbs.org/cringely/pulpit/2007/pulpit_20070308_001806.html). I think this is huge and worthy of being listed as front-page rumor news. Just another great reason to wait for the next big refresh after Leopard ships. :)

Sayer
Mar 9, 2007, 07:13 PM
This guy couldn't read the ingredients off of a box of cereal and get it correct. A chip cost cited at $50 is meaningless; Apple has huge buying power and can negotiate prices way, way down, below even wholesale (buying RAM from Apple is one of the biggest rip-offs ever).

Furthermore, Apple isn't going to simply "eat" the cost; it will be figured in somehow with reduced costs of some other component, or amortized over time as with every other product rollout.

This may or may not happen, but with this guy place your money on it NOT happening (the way he described it, at least).

Billy Boo Bob
Mar 9, 2007, 07:19 PM
First of all, yeah right; second of all, why? While H.264 produces smaller files, they're still massive compared to YouTube-ish vids, and even on the latest Core 2 Duo systems they tax the processor. I know, I know, "but the quality is better!" you say. Yep, but YouTube and the like have shown us that most people don't give a toss about the quality, just the bare minimum to get by.

Which is exactly why I don't "Browse YouTube". It's bad enough that a lot of it is home brewed garbage, but even with stuff that is quality content, the low-quality presentation of that content just turns me off. And, I'd certainly never post anything there if it means that my video turns to mush.

Anything that can be moved from the CPU to dedicated hardware is fine by me (as long as the cost to do so is within reason).

Fearless Leader
Mar 9, 2007, 07:19 PM
So if I am reading this correctly, the current MacBook has the H.264 Decoder in it...we just need to unlock it? Possibly with a driver like 802.11n?

Kind of. The 802.11n enabler was more of a firmware patch and a driver; it turned some stuff on at the chip level too. But a similar thing could be done for the graphics chips.

MacAodh
Mar 9, 2007, 07:24 PM
http://www.pbs.org/cringely/pulpit/2007/pulpit_20070308_001806.html - I see... so MacBook, meet :apple: tv... and have a very sexy lovechild... if the cost is similar, methinks that would be fairly nice by the sounds of it :D :D :D

Eidorian
Mar 9, 2007, 07:25 PM
All Intel Macs (with Nvidia and ATI graphics cards) should already have hardware H.264 decoding. IIRC the ATI cards can even do hardware encoding (well ATI were working on this a while back, but haven't heard much since).

So it should just be a matter of enabling those features.

That was my thought. Why put a dedicated chip in when you already have a $50 chip in there that does what you want?

I don't think the X1600 had the encoder built in, but the newer models do. It's just a matter of turning it on. However, if it is built into the graphics card, then the faster the card, the faster the encoding.

http://ati.com/technology/Avivo/index.html

Yeah, Avivo is in there. It just needs to be enabled. Why add additional hardware when the GPU is there waiting to be used to encode/decode? I know it took some time for Windows to get a driver for it.

Most of the talk that I've heard is that it is ridiculous to go beyond what the GPU does.

bankshot
Mar 9, 2007, 07:42 PM
I dunno. Has Cringely ever been right about anything? That alone hurts the believability of this rumor. Call me a skeptic. :rolleyes:

jwsmiths
Mar 9, 2007, 07:47 PM
All Intel Macs (with Nvidia and ATI graphics cards) should already have hardware H.264 decoding. IIRC the ATI cards can even do hardware encoding (well ATI were working on this a while back, but haven't heard much since).

So it should just be a matter of enabling those features.

I remember when they first announced the iMac and MacBook Pro with the X1600, they touted the ability to do hardware H.264 decoding; I don't think encoding was ever mentioned. This may be more in reference to the MacBook, Mac mini, and edu-line iMac models that use the GMA950 and presently don't do hardware H.264.

synth3tik
Mar 9, 2007, 07:48 PM
If only there were a way for those of us who just bought our machines to get this.

Eidorian
Mar 9, 2007, 07:53 PM
I remember when they first announced the iMac and MacBook Pro with the X1600 they touted the ability to do hardware H.264 decoding, don't think encoding was ever mentioned.

http://www.digit-life.com/articles2/video/avivo_1.html

http://www.chip.de/artikel/c1_artikel_17670022.html

casik
Mar 9, 2007, 07:54 PM
First of all, yeah right; second of all, why? While H.264 produces smaller files, they're still massive compared to YouTube-ish vids, and even on the latest Core 2 Duo systems they tax the processor. I know, I know, "but the quality is better!" you say. Yep, but YouTube and the like have shown us that most people don't give a toss about the quality, just the bare minimum to get by.

No, if anything, this is for converting DVDs for "personal" use, or it's not going to happen at all. I'd love to have process-specific dedicated hardware for lots of things (think what the Cell processor was designed for), and I hope that happens like it did for GFX cards.

Watching videos on YouTube makes me want to gag. They are horrid quality, and I would way rather wait a little longer for them to download.

Babasyzygy
Mar 9, 2007, 07:55 PM
It would be especially cool for the iPhone (if Apple also provides a camera lens on the same side as the screen) -- just pull out your iPhone while at Starbucks (using WiFi... I don't know about AT&T's EDGE) and have a video chat with your girlfriend.


I've been sure that Jobs kept serious features waiting for the final product announcement. I think that's it.

Seriously, that "smells" right as the "one more thing" - very technically feasible, and it has serious bang. Cue "Thus Spake Zarathustra." First note, show a clip of the commercial from the '90s with a video phone and the quote, "...and the company that'll bring it to you is AT&T." Second note, show a clip of the Macintosh (or NeXT cube, grin) introduction on a pedestal. Third note, show a clip of _2001: A Space Odyssey_, with the video phone and the AT&T symbol (Cingular is rebranding itself as AT&T). Crescendo, cut to and fade: Phil Schiller chatting with, say, a simulated kid telling him that he loves him and wishes he could be there to tuck him in tonight.

Pure magic. Instant pre-sales through the roof. Stockholders, rejoice.

Seriously, I think you've nailed it. The bandwidth is there, the hardware capability is there, the marketing case for it is compelling. There's no reason not to do it, unless power consumption and space make it unpossible :) .

diamond3
Mar 9, 2007, 07:56 PM
Well, this might seem like a dumb question. Anyway, would this help with any other type of encoding? Like, would this speed up the encoding in iDVD, etc., when compressing and encoding to burn a DVD? Also, if so, would it help the processors out that much?

roland.g
Mar 9, 2007, 07:59 PM
I've been sure that Jobs kept serious features waiting for the final product announcement. I think that's it.

Seriously, that "smells" right as the "one more thing" - very technically feasible, and it has serious bang. Cue "Thus Spake Zarathustra." First note, show a clip of the commercial from the '90s with a video phone and the quote, "...and the company that'll bring it to you is AT&T." Second note, show a clip of the Macintosh (or NeXT cube, grin) introduction on a pedestal. Third note, show a clip of _2001: A Space Odyssey_, with the video phone and the AT&T symbol (Cingular is rebranding itself as AT&T). Crescendo, cut to and fade: Phil Schiller chatting with, say, a simulated kid telling him that he loves him and wishes he could be there to tuck him in tonight.

Pure magic. Instant pre-sales through the roof. Stockholders, rejoice.

Seriously, I think you've nailed it. The bandwidth is there, the hardware capability is there, the marketing case for it is compelling. There's no reason not to do it, unless power consumption and space make it unpossible :) .

Sounds great, but seriously, reading your post, the excitement, how much coffee have you had today? :D

dmelgar
Mar 9, 2007, 08:04 PM
First of all, yeah right, second of all, why? While H.264 produces smaller files, they're still massive compared to YouTube-ish vids, and even on the latest Core2Duo systems, they tax the processor. I know, I know, "but the quality is better!" you say. Yep, but YouTube, and the like, has shown us that most people don't give a toss about the quality, just the bare minimum to get by.

No, if anything, this is for converting DVDs for "personal" use, or it's not going to happen at all. I'd love to have process-specific dedicated hardware for lots of things (think what the Cell proc was designed for), and I hope that happens like it did for GFX cards.
H.264 isn't bigger. It depends on the bitrate you encode with. It's one of, if not the best, video compression algorithms currently out there. You can encode at a low bitrate, i.e. a small file with poor quality, or encode 1080p in awesome quality with a file size that is still a fraction of a typical HD-DVD.
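The bitrate point checks out with back-of-envelope math: file size is just bitrate times duration, regardless of codec. A minimal sketch (the bitrates below are illustrative assumptions, not measurements of any particular encoder):

```python
# File-size arithmetic for the bitrate point above. The bitrates are
# illustrative assumptions, not measurements of any particular encoder.

def file_size_gb(bitrate_mbps: float, minutes: float) -> float:
    """Size in gigabytes of a constant-bitrate stream."""
    bits = bitrate_mbps * 1e6 * minutes * 60
    return bits / 8 / 1e9

# A two-hour movie at a few plausible H.264 bitrates:
for mbps in (1.5, 8.0, 20.0):
    print(f"{mbps:5.1f} Mbps -> {file_size_gb(mbps, 120):.1f} GB")
```

Even the 20 Mbps case comes out around 18 GB, which is the poster's point: quality is a bitrate choice you make at encode time, not a fixed property of the codec.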

AidenShaw
Mar 9, 2007, 08:24 PM
encode 1080p in awesome quality with a file size that is still a fraction of typical HD-DVD.

Or the same size as an H.264 Blu-ray or HD-DVD....

matticus008
Mar 9, 2007, 08:27 PM
Please enlighten me further. :)
Does this mean that my iMac (latest model) will (can) have this feature?
If so, how will it work? :confused:
No, not as specified in the article. AVIVO is not a dedicated chip, but rather part of the GPU. By enabling AVIVO, you will certainly get better performance, but that's not evidently the goal. The idea behind this posting is to include a separate chip to do H.264 encoding/decoding independent of the CPU and GPU. This in many ways is just an extension of the multiple cores theory.
That was my thought. Why put a dedicated chip in when you already have a $50 chip in there that does what you want.
For the same reason that you now have two CPUs instead of just one, even though one does what you want. Having a dedicated chip frees up the other subsystems for other tasks. With a separate unit, you could play a game while encoding H.264 in the background--something that you can't pull off too well even with mainstream AVIVO cards.
A $50 chip means a $100 difference in retail price.
Absolutely untrue--there is no single metric for pricing, except perhaps the overall margin, which would make that number about $60, not 100, and would still be a faulty conclusion to draw. Retail price isn't going to change substantially with or without this. It will simply sit in place of a price drop commensurate with its cost. When they switched from dirt cheap G4s to more expensive Core Duo processors in the PowerBook, prices didn't go up.

Eidorian
Mar 9, 2007, 08:27 PM
Or the same size as an H.264 Blu-ray or HD-DVD....
Which is amusing since you can then fit a 1080p video reasonably onto a dual-layer DVD.

vmardian
Mar 9, 2007, 08:37 PM
Apple's margins on hardware are about 40%, so adding a $50 chip to a MacMini is going to increase the retail price by $80. I don't see that happening.
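The margin arithmetic here and in the earlier posts can be made explicit. A minimal sketch (the 40% gross-margin and $50 part-cost figures are the thread's assumptions, not Apple data):

```python
# Retail impact of a component cost at a fixed gross margin. The 40%
# margin and $50 part cost are the thread's assumptions, not Apple data.

def retail_delta(component_cost: float, gross_margin: float) -> float:
    """Retail price increase that preserves the given gross margin."""
    return component_cost / (1.0 - gross_margin)

print(round(retail_delta(50, 0.40)))  # -> 83, close to the $80 figure above
```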

I think the source is simply referring to Intel's next chipset, the G35, which will probably cost $50 in the quantities that Apple buys, and which would probably end up in the MacMini, MacBook, and 17" iMac in Q3/Q4 2007.

Eidorian
Mar 9, 2007, 08:40 PM
http://www.hkepc.com/bbs/hwdb.php?tid=753250&tp=Intel-c2d-e6050&rid=753256

http://www.hkepc.com/bbs/hwdb.php?tid=753251&tp=Intel-c2d-e6050&rid=753256

http://www.hkepc.com/bbs/hwdb.php?tid=753252&tp=Intel-c2d-e6050&rid=753256

SMM
Mar 9, 2007, 08:48 PM
A $50 chip means a $100 difference in retail price. Does anyone really think they're going to raise the price of the Mac Mini by $100 when the Mini can already playback H.264 when not heavily loaded? That's a lot of money for something that is only useful to people like me who rip DVDs.

Unlikely Apple would pay retail price for the chip. Ordered in mass quantities, the price could be $10.

ppc_michael
Mar 9, 2007, 08:56 PM
So will this fix the bug affecting the gamma during playback on Windows computers? No? No. :rolleyes:

topdown5
Mar 9, 2007, 09:09 PM
I'm amazed that nobody has considered the idea that this might lead to TiVo-like features built into Front Row or the Apple TV. Rather than encoding TV into MPEG-2 or the .tivo format, a Mac could encode directly into an iPod-friendly H.264 format. We already know that Mac OS X can directly control a cable box and pull out live video using a FireWire connection (see the FireWire SDK at Apple.com). With this chip, you could connect your Mac to your cable box, pull out live video and have it encoded live into H.264. You could then watch these saved files in Front Row, on your Apple TV, or on your iPod. Sounds brilliantly easy to me.

zimtheinvader
Mar 9, 2007, 10:42 PM
H.264 encoding is prohibitively slow, even on Core Duo machines. I believe Apple has some sort of vested interest in making H.264 the encoding standard in the upcoming years (especially since the iPod doesn't play DivX, which is, imho, far superior). If they want to do this they need to make it easier on the machine. Heck, my 5.5G iPod plays H.264 SO much better than my loaded PB, it's rather sad, really... however, even my Treo plays 800x600 DivX flawlessly, and it looks basically the same, at smaller file sizes, with much faster encoding to boot!

I reaaaaally hope the iPhone plays DivX at some point (a la mplayer). Otherwise, my PalmOS Treo is superior in this capacity, especially since it can dynamically resize full-screen DivX video to fit its small screen.

If Apple makes H.264 encoding easier and faster, though, I think it can begin to assert itself in the mobile video market, as iTunes' easy CD ripping did for music, and this incentive may cause Apple to initially underwrite the cost of chip inclusion...

Avatar74
Mar 9, 2007, 11:25 PM
http://www.macrumors.com/images/macrumorsthreadlogo.gif (http://www.macrumors.com)

Robert Cringley claims (http://www.pbs.org/cringely/pulpit/2007/pulpit_20070308_001806.html) that Apple is planning on incorporating dedicated H.264 decoding chips into future Mac hardware.

The article claims that incorporating a dedicated H.264 decoding chip will allow Apple to ensure the same base performance on every machine it sells. The $50 chip is said to also offer H.264 encoding to allow users to quickly encode high quality video clips for upload to the internet.

Cringely is widely regarded as nothing more than an attention whore who makes outlandish (and frequently WRONG) predictions just to stir up controversy (and site hits).

It doesn't make sense to incorporate such a chip because it wastes the modularity of computer hardware which can facilitate future codecs of any kind via software. The answer isn't in a dedicated ASIC for H.264 that will be entirely obsolete within a year... at least not for Apple. Good industrial design, not preprogrammed obsolescence, is what adds value to Apple products. In this case, good industrial design means incorporating more powerful multipurpose CPU and GPU chipsets as opposed to dedicated encoder/decoder chips.

Keep blindly throwing darts at that board, Cringely... eventually one of them will hit a bullseye by sheer luck.

Mtn Tamale
Mar 9, 2007, 11:56 PM
No way is Apple going to put in a $50 dedicated chip for all the most intelligent reasons stated above. This is probably just a strategic leak to help push ATI into lowering their royalty costs to unlock AVIVO for Apple. Kind of like reverse FUD.

IJ Reilly
Mar 10, 2007, 12:04 AM
I was addressing that by saying that H.264 /hardware/ encoding/decoding at this point is already 'too expensive' to be worth doing. H.264 decoding at 1080p can be done /now/ in software in Apple's entire current line. That leaves encoding as the main benefit. H.264 can already be encoded in realtime on some of Apple's product line.

For Apple to be willing to drop $50 on an H.264 chip, they are either expecting to shove HD-DVD or Blu-ray into the entire line (not likely; I am not sure people want a $1000 mini with 512MB of RAM and no GPU)... or they are expecting to do something HUGE with encoding (media center competition?). By the time the chip starts doing good things for Apple's sales, software will be able to nearly overtake the hardware, and they'll take it back out again.

Cringely's musings, while entertaining, have really made me wonder why I looked up to the guy when I was younger. His point that this chip is superior to a software solution banks on this 'cheap' chip doing things better than a CPU, but misses the point of this cheap, specialized chip in the first place. A DVD player needs to be cheap, and you can't go around putting a $200 CPU into it just to play DVDs... you build a custom DSP for DVDs, along with a $10 CPU, and save more than 50% on your costs. A computer is already expensive, so using a bunch of specialized chips doesn't make much sense, when the $200 CPU /can/ do it all.

I think it makes sense if you're trying to avoid taxing the CPU so it can do other things while encoding and decoding. It depends on how serious Apple has become about the Mac as a media center.

Krevnik
Mar 10, 2007, 12:18 AM
I think it makes sense if you're trying to avoid taxing the CPU so it can do other things while encoding and decoding. It depends on how serious Apple has become about the Mac as a media center.

It also assumes that the Mac will become a TiVo. I don't see that. Microsoft doesn't even see that, but it lets you use Media Center as a TiVo because the real show isn't ready yet.

With a good implementation of IPTV, using H.264, I can see the Mac not even needing this chip, and Apple TV being able to access your next-gen Comcast subscription without the Mac as a hub (unless you want to record, maybe, and even then, it is just saving a file to disk, rather than any real encoding job).

The future is one where the DVR as we know it doesn't exist. Instead, you have internet media streaming boxes (using bandwidth freed up by the migration from old-style QAM digital cable to IP services), both in the classic form we are used to (to avoid making people uncomfortable), but also in the form of cheap HTPCs like the Mini, the Apple TV, and even cheap PCs that don't need cable cards, hardware encoders, and so on.

One link, multiple services... IP for phone, the web, and media. Cable companies already have the infrastructure in place to provide media; it is just locked up in older cable bandwidth (a Comcast 6 Mbps internet link only takes up one HD channel's worth of bandwidth, so think of the local link speeds Comcast could provide to customers for media streaming over IPTV).
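The bandwidth claim can be sanity-checked with rough numbers. A quick sketch (the channel and stream rates below are common ballpark figures for 2007-era systems that I'm assuming, not Comcast specifics):

```python
# Rough cable-bandwidth arithmetic for the IPTV argument above.
# All rates are ballpark assumptions, not operator-specific figures.

QAM256_CHANNEL_MBPS = 38.8   # approx payload of one 6 MHz 256-QAM channel
MPEG2_HD_MBPS = 15.0         # typical MPEG-2 HD broadcast stream (assumed)
INTERNET_TIER_MBPS = 6.0     # the internet tier mentioned in the post

print(QAM256_CHANNEL_MBPS / MPEG2_HD_MBPS)       # HD streams per channel
print(QAM256_CHANNEL_MBPS / INTERNET_TIER_MBPS)  # 6 Mbps tiers per channel
```

Each legacy broadcast channel reclaimed could carry several 6 Mbps subscriber links, which is the spare capacity the post is pointing at.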

nuckinfutz
Mar 10, 2007, 12:31 AM
Cringely is widely regarded as nothing more than an attention whore who makes outlandish (and frequently WRONG) predictions just to stir up controversy (and site hits).

It doesn't make sense to incorporate such a chip because it wastes the modularity of computer hardware which can facilitate future codecs of any kind via software. The answer isn't in a dedicated ASIC for H.264 that will be entirely obsolete within a year... at least not for Apple. Good industrial design, not preprogrammed obsolescence, is what adds value to Apple products. In this case, good industrial design means incorporating more powerful multipurpose CPU and GPU chipsets as opposed to dedicated encoder/decoder chips.

Keep blindly throwing darts at that board, Cringely... eventually one of them will hit a bullseye by sheer luck.

I think you're a bit off here. Cringely isn't taking credit for this but rather speculating on rumors that he's heard.

Now comes the rumor I have heard, that I believe to be a fact, that has simply yet to be confirmed. I have heard that Apple plans to add hardware video decoding to ALL of its new computers beginning fairly soon, certainly this year.

Modularity of computer hardware means little with today's OS X. Developers no longer write to the metal. Everything is abstracted to the point where Apple can keep OS X as portable as possible. Look at the iPhone, which runs a reduced subset of OS X, for example. In fact, if you look at the new QTKit, which is replacing the older 32-bit QuickTime framework, you'll find that it is well insulated from the hardware. So Apple can in fact add a dedicated encoder/decoder chip, and the process of making QuickTime use the chip shouldn't be too difficult.

If good industrial design means adding more powerful CPUs and GPUs, then why is AMD pursuing their Torrenza platform, which does just what Apple is rumored to be doing here: plug-in co-processor modules linked with HyperTransport.

http://en.wikipedia.org/wiki/Torrenza

The idea makes perfect sense if the dollars match up well. Apple wants to sell HD content via iTunes. They need a low-bandwidth codec... check: H.264/AVC. Now they need a solid way to play back said content without requiring a Mac Pro.

They're not going to put discrete graphics in the Mac mini and MacBooks, so forget AVIVO or PureVideo. If they can put in a dedicated co-processor for AVC encode/decode, a world of options opens up.

iLife '07 - iDVD and iMovie handle AVC-based codecs much more easily.
Leopard - iChat AV and QTKit handle AVC encode/decode much better.
Pro - Final Cut Studio and Compressor all handle AVC more easily.

Yes, you can use the GPU, but you still have finite power per watt. Leopard is going to be using the GPU much harder than Tiger, with OpenGL 2.1 being multithreaded and AVC being used even more. Apple's going to need some efficient compression. They are claiming "significant" improvements to AVC encoding in QTKit, but we're going to need more.

This may just be a rumor but it certainly makes sense and OS X allows this to happen quite easily IMO.

EricNau
Mar 10, 2007, 12:33 AM
If this is true, I'm hoping it's because Apple will introduce "Media Center" capabilities into OS X, allowing users to record and stream television shows from their computer to :apple:tv.

Or... Perhaps allow users to legally rip DVDs into iTunes (although I'm doubting this for two reasons: right now it's illegal, and it might affect iTunes movie sales).

AppleMan101
Mar 10, 2007, 03:22 AM
Apple's margins on hardware are about 40%, so adding a $50 chip to a MacMini is going to increase the retail price by $80. I don't see that happening.

{SNIP}
...and nothing else in any of the computers has been dropping in price since the products were first released? $50/£25 in costs isn't going to affect final prices or profit margins significantly, IMO.

syriana
Mar 10, 2007, 03:27 AM
This might just be me being stupid, but $50 (or €50 as it's going to be converted into :mad: ) is a bit of a price to pay for faster ripping of DVDs... or is there more to this that I'm missing???

HD iChat?? With a new iSight??

spicyapple
Mar 10, 2007, 04:35 AM
Didn't Cringely suggest Apple price down the Mac minis to a schweet price point to invade the living room? That sorta happened with the Apple TV. Think about it. Nearly the same form factor. Low price. Front Row capable.

H.264 is the cornerstone of Apple's home invasion strategy, so it makes good sense to include a cheap yet powerful dedicated encoder/decoder chip.

Cringely has a whole bunch of wild ideas, but it's the crazy people like him who can see the future clearly.

Rod Rod
Mar 10, 2007, 06:21 AM
The new ATi chips have H.264 decode/encode built in (AVIVO). All this means is that all the Mac's will be getting ATi chips (or the nVidia equivalent). There won't be a dedicated chip other than that.

I don't get what he thinks the big hoopla is about. Anyone with half a brain could have predicted this. (Except maybe in the Mini or Macbook which don't have ATi chips but may in the future have them)
Assuming it did MPEG2 as well as h264 (not unlikely if it happened) it would benefit iDVD
IIRC the ATI cards can even do hardware encoding (well ATI were working on this a while back, but haven't heard much since).

The ATi Mobility Radeon 9600 (http://www.anandtech.com/printarticle.aspx?i=1802) has MPEG-2 hardware encode acceleration. Even the nVidia FX 5200 has an MPEG2 encode assist engine (scroll further down the same link).

The ATi Radeon 9600 and 9700 were in every single 15" PowerBook G4, and in all but the first revision of the 17" PowerBook G4. The FX 5200 was in every single 12" PowerBook G4 except the first revision.

On the desktop side, the Radeon 9600 shipped in heaps of Power Mac G5s, iMac G5s and even the final version of the eMac. The FX 5200 shipped in heaps of Power Mac G5s, iMac G5s, and the last revision or two of the iMac G4.

Altogether that represents millions of Macs. Apple never enabled hardware-accelerated MPEG-2 encoding on the 9600/9700/5200. Therefore it's neither obvious nor a foregone conclusion that Apple will necessarily avail itself of hardware features in GPUs of the present or future. So far it hasn't happened. It would be nice if that were to change.

So far, Apple has seen fit to leave MPEG-2 decoding and encoding in software and on the CPU(s).

I'm amazed that nobody has considered the idea that this might lead to TiVo-like features built into Front Row or the Apple TV.
It's not so amazing, because Cringely talks about exactly that in his column, which is conveniently linked (http://www.pbs.org/cringely/pulpit/2007/pulpit_20070308_001806.html) in the original post (http://forums.macrumors.com/showpost.php?p=3427851&postcount=1).

TBi
Mar 10, 2007, 06:53 AM
Altogether that represents millions of Macs. Apple never enabled hardware-accelerated MPEG-2 encoding on the 9600/9700/5200. Therefore it's neither obvious nor a foregone conclusion that Apple will necessarily avail itself of hardware features in GPUs of the present or future. So far it hasn't happened. It would be nice if that were to change.

So far, Apple has seen fit to leave MPEG-2 decoding and encoding in software and on the CPU(s).


Maybe because they're too lazy to write the drivers, or they've seen no need to.

Anyway, if you did your homework you'd see that you need the ATi Rage Theatre chip as well. That's the chip that does the MPEG-2 encoding.

Personally, though, I don't want to spend 50 quid for an extra chip in my Mac when I know the graphics card will do it for me. Plus I know Apple is all about margins, so I don't think they'll put that powerful card to waste.

Manic Mouse
Mar 10, 2007, 07:19 AM
Nvidia is launching its mobile 8-series GPUs with Santa Rosa; has anyone thought of the possibility of Apple using them? The new unified shader architecture can be used for way more than simply pushing pixels, and would be ideal for grinding through those H.264 files and freeing up the CPU.

This move also points to the possibility of DVD ripping in iTunes, like CDs. I don't see why this isn't already a feature with all the emphasis Apple is putting on Movies/TV shows. Especially since if you live outside the US you don't have access to either on the iTunes store (making things like Video iPods and the :apple: TV pretty much useless).

Digitalclips
Mar 10, 2007, 07:25 AM
Not sure where the best place to have this discussion is, but Cringley is at it again, this time speculating (well, hinting that he knows) that Apple will be putting h.264 encoding/decoding hardware in all Macs as soon as this year.

Here's the article:
http://www.pbs.org/cringely/pulpit/2007/pulpit_20070308_001806.html

Sorry if this has been covered already, but this may also have potential iPhone and iChat conferencing implications. I suspect Macs and iPhones will be able to work together seamlessly in this area soon.

What's the betting they don't name this chip 'AltiVec ][' ;)

Rod Rod
Mar 10, 2007, 07:33 AM
Maybe because they're too lazy to write the drivers, or they've seen no need to.

Anyway, if you did your homework you'd see that you need the ATi Rage Theatre chip as well. That's the chip that does the MPEG-2 encoding.

Personally, though, I don't want to spend 50 quid for an extra chip in my Mac when I know the graphics card will do it for me. Plus I know Apple is all about margins, so I don't think they'll put that powerful card to waste.

Supposing that you're right about the encoding happening on some marketing-speak "Rage Theatre" chip (which, whatever it is, appears to be built into the 9600 based on the Anandtech article (http://www.anandtech.com/printarticle.aspx?i=1802)), there has been hardware-accelerated MPEG-2 decoding since the ATi 9000 or 9200. Apple's never made use of that either.

To the second point, that's been asked and answered with respect to the Mac mini / MacBook lines. Economies of scale take hold at some point and who knows, it could become a relatively minor component cost.

Digitalclips
Mar 10, 2007, 07:34 AM
Yes they do, and I am waiting for an ATI Radeon X2800XT Blu-ray equipped Mac Pro

Oh yes!:cool: ... drool .... and that's Blu-ray BURNER equipped!

Jowl
Mar 10, 2007, 07:41 AM
I don't really see much point in the mac mini for example...

I would love one in mine... as it's a media centre (or will be), it will help encode my TV recordings and DVDs quicker.

Rocketman
Mar 10, 2007, 09:17 AM
To me, this is like Apple adopting the Amiga philosophy a few decades late. The Amiga worked around the limits of the chips of its day by having several specialized chips: one for graphics, one for sound, a general CPU, etc. Apple has mainly used general-purpose chips combined with feature-rich operating systems. That's why Macs were "slow" for so many years. Yes, they were better, and you could get to work sooner and were inclined to take on more complex tasks, but the computer itself was always making you wait for something.

Times have changed, and even the general-purpose chips now rarely make you wait on consumer-level tasks. Adding specialized chips for heavy processing tasks is smart, so at least we do not go back in time and become familiar with the NEW watch.

Rocketman

Hattig
Mar 10, 2007, 09:21 AM
A dedicated chip can help in two ways:

In mobile devices: cutting down power consumption (1W instead of 35W) and leaving the CPU free

In desktop devices: vastly increasing the performance and leaving the CPU free

i.e., you get a lot more performance per Watt with a dedicated chip than by using the CPU, and you can also get much greater performance.


Personally, however, I think that Apple will be utilising unified shaders in the GPUs in their systems to aid in decoding and encoding. This may mean that AMD/ATI will be supplying graphics chips across the board, possibly even chipsets if Intel's integrated chipsets aren't suitable for this task.
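Taking the post's own power figures at face value, the performance-per-watt gap is easy to state (a sketch; the 1 W and 35 W numbers are the post's illustrative assumptions):

```python
# Perf-per-watt comparison using the figures from the post above: a 1 W
# dedicated decoder vs ~35 W of CPU doing the same realtime decode job.

dedicated_watts = 1.0
cpu_watts = 35.0

# Same work accomplished (1x realtime decode), so efficiency scales
# inversely with power draw.
advantage = cpu_watts / dedicated_watts
print(advantage)  # 35.0 -> 35x the performance per watt
```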

sigamy
Mar 10, 2007, 09:41 AM
This is great news. As Blu-ray and HD-DVD battle it out, Apple is going to come in and steal the show. The future of high-def delivery is not going to be disc-based; it will be downloads.

The new disc formats will be great for archival, but that's all. The next Apple TV will have at least a 100GB HD to allow for more storage of high-quality H.264 content.

Digitalclips
Mar 10, 2007, 09:42 AM
To me, this is like Apple adopting the Amiga philosophy a few decades late. The Amiga worked around the limits of the chips of its day by having several specialized chips: one for graphics, one for sound, a general CPU, etc. Apple has mainly used general-purpose chips combined with feature-rich operating systems. That's why Macs were "slow" for so many years. Yes, they were better, and you could get to work sooner and were inclined to take on more complex tasks, but the computer itself was always making you wait for something.

Times have changed, and even the general-purpose chips now rarely make you wait on consumer-level tasks. Adding specialized chips for heavy processing tasks is smart, so at least we do not go back in time and become familiar with the NEW watch.

Rocketman

To reinforce the specialized-chip concept you only have to look back at cards like those from Matrox. In the mid-'90s I edited a season of TV shows for ESPN on a Quadra 840 with full real-time editing, crossfades and many special effects. No rendering, and straight to tape for broadcast. All done by a card plugged into the PCI slot. The Quadra alone could barely play a QuickTime movie! I want a similar card (built in by Apple, not 3rd party; Matrox charged an arm and a leg for the software) that even handles 1080i HD. Well, I can dream, right? Or maybe that's what's coming at NAB :)

kLy
Mar 10, 2007, 10:35 AM
It does look like AVIVO has encoding capabilities too, if you have a look here:

http://ati.amd.com/technology/avivo/technology.html

Apparently, it's pretty quick too:

ATI’s Avivo Video Converter can take a 30 minute recorded show, and convert it into a format playable by an iPod in less than 5 minutes. It can cut the conversion time by 80% or more.

Though what I'm not too certain about is whether this is the same AVIVO hardware that's included in the X1600, or if the AVIVO converter is actually just a piece of software that's been fudged together with the hardware in the marketing under "AVIVO".
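For what it's worth, ATI's two numbers in that quote are mutually consistent. A quick check, assuming the baseline is a realtime (1x) software conversion:

```python
# Sanity check of the AVIVO claim quoted above: a 30-minute show
# converted in under 5 minutes, versus an assumed 1x realtime baseline.

show_minutes = 30.0
convert_minutes = 5.0   # "less than 5 minutes", taken at the limit

speedup = show_minutes / convert_minutes
reduction = 1.0 - convert_minutes / show_minutes

print(speedup)             # 6.0x over realtime
print(f"{reduction:.0%}")  # 83%, i.e. "80% or more"
```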

Stridder44
Mar 10, 2007, 10:35 AM
This is great news. As Blu-ray and HD-DVD battle it out, Apple is going to come in and steal the show. The future of high-def delivery is not going to be disc-based; it will be downloads.

The new disc formats will be great for archival, but that's all. The next Apple TV will have at least a 100GB HD to allow for more storage of high-quality H.264 content.


We'll see about that. Not saying it can't happen (it probably will fairly soon), but for now I'm betting that disc formats are here to stay for a while.


People like to have their content mobile. Music worked so well because you have the iPod. People don't really have any way to move their movies around (say, to a friend's house) unless it's on a DVD (or some disc format). iPods don't count here, as they aren't DVD quality, nor does a group of people want to hook one up to their TV (if they don't have to). It's way simpler to pop in a DVD instead.


One day though, this'll change. We just gotta have the right device to do it.

Durendal
Mar 10, 2007, 10:39 AM
Cringely is only half a rung above Dvorak for idiotic trolling. This is just more BS. All Macs currently sold can do 1080p decoding without skipping a beat, so why in blazes would Apple put in a dedicated hardware decoder? They're already offloading it to the video card as much as they can. There's no need for one. Performance baseline my foot. I don't remember Apple ever using a hardware encoder for MPEG-2 (hardware decoding, yes, but that was before they figured out how to do it in software without a hiccup, and they killed support for them in OS X), so why would they do it now? Hardware H.264 decoding is done in video, and encoding will be aided by it very soon. The big boys will still buy hardware encoders for their Mac Pros, but I don't see Apple doing it anytime soon. Cringely is once again smoking crack.

IJ Reilly
Mar 10, 2007, 10:58 AM
An interesting discussion. Thanks for all the new info! :cool:

mccoma
Mar 10, 2007, 11:33 AM
I'll list some points then get to my take:

Apple has already been working with NVIDIA on iPod chips
AMD bought ATI
AMD is listed as a takeover target
Apple seems pretty set with Intel
Apple has been pushing a lot of processing into the GPU. To the point that Lightroom might outperform Aperture on systems with more Intel cores.
Cringely has this rumor of encode hardware for H.264
a cut version of OS X is rumored to be migrating to iPod (already iPhone)
Cringely listed the added cost as $50 per unit


I gotta wonder if Cringely's rumor is a shadow of the real story.

I guess, like a lot of people, I had hoped the PowerPC could keep up simply so Apple hardware would have an edge due to it being built for Apple's needs. I had no idea a little company like Parallels would come along and change the whole game. So, the Intel change has been a good thing for getting people to buy a Mac. Surprisingly, I have watched people migrate in pieces so Parallels is barely running (oh, I should surf the web and get mail on the Mac side since it's safer). It looks like the next generation of Intel hardware will be lower wattage and more cores.

Apple has been using the GPU to do a lot of acceleration (Core Image/Video/Animation). It would seem to cut down on Apple's costs for software development to get the needed programming down to as few chips as possible (Software costs a lot to develop). Also, GPU-type operations can be parallelized. Video encoding is something Apple is very interested in. Particularly if the encoding can be done in realtime at high resolutions. I guess I really wonder if Apple is getting NVIDIA to build it a custom GPU built for Apple's needs (low power, encoding, Core *). Put 1 GPU in the mini and 2 or 4 in the Mac Pro. Use the same GPU across the line and just increase the number for the more expensive systems. Apple is selling a lot more boxes these days and this could be pretty viable for a GPU maker.

I guess this can be chalked up to me dreaming, but it does seem kinda logical.

PS: Anyone think of how the settlement with Apple Corps will change the kind of sound hardware that can be incorporated into the Mac?

shyataroo
Mar 10, 2007, 01:17 PM
It's obvious that Apple is going to use this so you can convert your movies into iTunes format and upload them to iTunes as part of a new service that lets you download user-made movies (of course, protected stuff won't be allowed, and copyrighted stuff won't be either; how they are going to do that, I don't know).

nuckinfutz
Mar 10, 2007, 02:26 PM
Cringely is only half a rung above Dvorak for idiotic trolling. This is just more BS. All Macs currently sold can do 1080p decoding without skipping a beat, so why in blazes would Apple put in a dedicated hardware decoder? They're already offloading it to the video card as much as they can. There's no need for one. Performance baseline my foot. I don't remember Apple ever using a hardware encoder for MPEG-2 (hardware decoding, yes, but that was before they figured out how to do it in software without a hiccup, and they killed support for them in OS X), so why would they do it now? Hardware H.264 decoding is done in video, and encoding will be aided by it very soon. The big boys will still buy hardware encoders for their Mac Pros, but I don't see Apple doing it anytime soon. Cringely is once again smoking crack.

I don't know if I agree with the "1080p without skipping a beat". My mini certainly can't decode AVC at 1080p without a few stutters here and there. Cringely explained exactly why Apple would do such a thing: it creates a baseline of performance for AVC encoding and decoding in EVERY Mac. Sure, you could add a more expensive GPU, but show me a GPU that consumes a single watt of power that can encode 720p/24 AVC video like the 3Dlabs DMS-02 can. Frankly, I'd rather there be a dedicated chip for the encoding/decoding task so that my GPU can be dedicated to user interface rendering.

I really haven't heard many plausible reasons why Apple couldn't or shouldn't do this that don't involve slandering Cringely. I'd rather read Cringely's "pie in the sky" stuff than a bunch of posts from people who can't think outside the box.

IJ Reilly
Mar 10, 2007, 02:51 PM
I really haven't heard many plausible reasons why Apple couldn't or shouldn't do this that don't involve slandering Cringely. I'd rather read Cringely's "pie in the sky" stuff than a bunch of posts from people who can't think outside the box.

Agreed. Cringely may not always be right (by any means), but he's always intriguing, and far better informed on the baseline technical issues than 90% of his compatriots in tech journalism. It's becoming a rare treat to read a column from somebody who is actually thinking instead of simply regurgitating press releases.

icrude
Mar 10, 2007, 03:26 PM
Very smart... the Mac mini is used as a multimedia computer just as much as the Mac Pro is... so let them both be able to play back high-end H.264, such as HD DVDs.

kresh
Mar 10, 2007, 09:11 PM
If they add a hardware chip, then isn't this the same thing as a PCI encoder card for real-time encoding?

What I mean is that almost every hardware encoder is for real-time broadcasting (~3 Mbps).

For really great quality, a software encoder does a much better job. With software you can have 2-pass encoding with rate correction, something I have never seen with hardware encoding.

This chip would add benefit for decoding and watching streamed video, but I am interested in speeding up Handbrake and such.

Just my $0.02
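The two-pass idea can be sketched in a few lines of Python. This is purely a toy model (the complexity scores and bit budget below are made up, and real encoders are far more sophisticated): pass one scores how busy each frame is, and pass two splits a fixed bit budget in proportion to those scores, so high-motion frames get more bits than static ones.

```python
# Toy model of two-pass rate control. Pass 1 has already produced a
# per-frame "complexity" score; pass 2 splits a fixed bit budget in
# proportion to complexity, so busy frames get more bits.

def two_pass_allocate(complexities, total_bits):
    """Allocate total_bits across frames proportionally to complexity."""
    total_complexity = sum(complexities)
    return [total_bits * c / total_complexity for c in complexities]

# Hypothetical first-pass scores: the second frame is high-motion.
complexities = [1.0, 4.0, 2.0, 1.0]
budget = 800_000  # bits available for these four frames

allocation = two_pass_allocate(complexities, budget)
print(allocation)  # [100000.0, 400000.0, 200000.0, 100000.0]
```

A single-pass hardware encoder has to guess at future complexity as it goes, which is exactly why the two-pass software result tends to look better at the same average bitrate.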

nuckinfutz
Mar 10, 2007, 09:26 PM
If they add a hardware chip, then isn't this the same thing as a PCI encoder card for real-time encoding?

What I mean is that almost every hardware encoder is for real-time broadcasting (~3 Mbps).

For really great quality, a software encoder does a much better job. With software you can have 2-pass encoding with rate correction, something I have never seen with hardware encoding.

This chip would add benefit for decoding and watching streamed video, but I am interested in speeding up Handbrake and such.

Just my $0.02

Leopard is going to "significantly" improve H.264 encoding, and Quicktime will now support transparent alpha channels. The issue, though, is that even in the consumer space we're moving to HD, and Apple's champion is H.264/AVC, which on average is about 8x more expensive to encode than MPEG-2. I guess it's a matter of choosing to utilize the GPU more or looking at a dedicated media processor. The Apple TV only supports MPEG-4 video... a severe limitation, but if Apple can improve transcode speed, then near-realtime transcoding and delivery could make most QT-supported codecs playable on the ATV.
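That 8x multiplier implies a simple speed budget. A back-of-envelope check (the MPEG-2 encode rate here is a made-up example; only the rough 8x factor comes from the post above):

```python
# If a machine encodes MPEG-2 at some rate and H.264 costs ~8x as much
# per frame, what H.264 rate does it manage, and is that realtime?

mpeg2_fps = 120           # hypothetical MPEG-2 software encode rate
h264_cost_multiplier = 8  # the rough factor cited above
target_fps = 30           # realtime for typical video

h264_fps = mpeg2_fps / h264_cost_multiplier
print(h264_fps)                # 15.0
print(h264_fps >= target_fps)  # False: only half realtime, hence the
                               # interest in dedicated silicon
```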

Cult Follower
Mar 10, 2007, 09:40 PM
I really don't see the point in the more consumer Macs, unless they are going to really push H.264 in Leopard.

nuckinfutz
Mar 10, 2007, 10:19 PM
I really don't see the point in the more consumer Macs, unless they are going to really push H.264 in Leopard.

iTunes movie downloads- h.264
iChat Theater - h.264
Quicktime improvements- h.264
Apple TV support- MPEG4 which includes h.264

h.264 is very important to Apple. It's the codec of choice.

Highland
Mar 10, 2007, 10:42 PM
This guy couldn't read the ingredients off a box of cereal and get them right. A chip cost cited at $50 is meaningless; Apple has huge buying power and can negotiate prices way, way down, below even wholesale (buying RAM from Apple is one of the biggest rip-offs ever).

Furthermore, Apple isn't going to simply "eat" the cost; it will be figured in somehow, offset by reduced costs of some other component or amortized over time, as with every other product rollout.

This may or may not happen, but with this guy, place your money on it NOT happening (the way he described it, at least).
I agree.

Also... how will this aid PVR or media center functionality? My Mac mini and EyeTV don't need to do ANY video encoding... the digital TV stream is already compressed. The EyeTV software just captures it and saves it to disk.

The only reason I can see this happening is if Apple decides to turn iTunes into a DVD-ripping and library application as well (which is possible, but doubtful).

milatchi
Mar 11, 2007, 12:19 AM
If it's true, this is pretty cool.

MrCrowbar
Mar 11, 2007, 12:37 AM
Come to think of it, a dedicated processor for H.264 encoding/decoding would be very welcome. My generation-1 MacBook gets really busy when playing 720p content, especially when I have all those "always on" apps like Mail, Firefox, iChat, Skype, iTunes, and iCal running all the time.

I'm totally satisfied with the processing power of the MacBook, though I had to pimp out the hard drive and RAM to the maximum to fit my needs. The only thing that is painfully slow is encoding my home movies into H.264 (my camera only records uncompressed, i.e. 8 minutes = 1 GB). It just takes hours and the fans are on max the whole time. With that chip, the CPU would be free for other useful stuff and, considering the chip only draws 1 watt, the whole thing would be much quieter.

It's also annoying to have videos skip a few frames because some other app decides to eat up all of the CPU power for a millisecond.
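For what it's worth, the "8 minutes = 1 GB" figure works out to a fairly modest source bitrate. A quick check (assuming GB means 10^9 bytes, and picking a hypothetical 5 Mbit/s H.264 target; both assumptions are mine, not the camera's spec):

```python
# Implied bitrate of the "8 minutes = 1 GB" camera footage above,
# and the size ratio against a hypothetical 5 Mbit/s H.264 encode.

gb = 10**9                     # bytes, assuming decimal gigabytes
seconds = 8 * 60
source_bps = gb * 8 / seconds  # bits per second of the source

h264_bps = 5_000_000           # hypothetical H.264 target bitrate
ratio = source_bps / h264_bps

print(round(source_bps / 1e6, 1))  # 16.7 (Mbit/s source)
print(round(ratio, 1))             # 3.3 (times smaller at 5 Mbit/s)
```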

twoodcc
Mar 11, 2007, 01:29 AM
I hope this is true. I guess we can't add this to our current Macs, though?

Scottgfx
Mar 11, 2007, 01:29 AM
To me, this is like Apple adopting the Amiga philosophy a few decades late. The Amiga worked around the limits of the chips of its day by having several specialized chips: one for graphics, one for sound, a general-purpose CPU, etc.

It's just that today, GPUs and northbridge/southbridge chips work pretty much like the old Amiga chipset. Today's CPUs also have more cycles to spare than they did back then.

I was thinking along the same lines as you when I read the report, but I had a different take. After the Amiga 3000 was released, they were working on the Amiga 3000+. The system engineer, Dave Haynie, had an AT&T DSP on the motherboard. Depending on the configuration, the DSP could become an audio processor or a full modem. Sadly, Commodore was too short-sighted to actually bring this to market.

I recommend a book called "On The Edge". It's a pretty faithful account of the early years of computing, without the Apple bias. (Please note that I'm a Mac user and Apple shareholder. But I know the history and know that Apple wasn't the only one out there back in the day.)

dude-x
Mar 11, 2007, 03:29 AM
H.264 is used in Blu-Ray movies, and all the hi-def (and standard def) quicktime stuff. It would be nice if Apple includes a competent h.264 decoder that's better than what NVIDIA/ATI provides, because their solutions don't use any advanced features to improve the image for HD content.

On another note, someone in the know blurbed out that the next Apple Cinema HD displays will support HDMI (presumably spec 1.3a) and wide gamut colors (supports 10 bit per channel), and they should be available Real Soon Now(tm).

These are exciting times...

guzhogi
Mar 11, 2007, 08:34 AM
So a $50 chip will be faster than the Core 2 Duo monsters?

Probably at encoding/decoding H.264. The C2D is a general-purpose chip: it's not really optimized for any one specific task. This $50 (or however much) chip is probably optimized specifically to encode/decode H.264, but can't do other stuff. It's basically a trade-off: do you want something that does everything at an OK speed, or something that does one thing, but blazing fast?

This reminds me of one time when I was thinking about OpenGL and OpenAL and wondered what other Open*Ls there were. I tried every other letter of the alphabet and found OpenRL, an API designed to help make movie software, built for an add-on card by Aspex Semiconductors. This card sounds like it does the exact same thing. Check it out at www.aspex-semi.com

gnasher729
Mar 11, 2007, 08:50 AM
http://ati.com/technology/Avivo/index.html

Yeah, Avivo is in there. It just needs to be enabled. Why add extra hardware when the GPU is there waiting to be used to encode/decode? I know it took some time for Windows to get a driver for it.

Most of the talk I've heard is that it's ridiculous to go beyond what the GPU does.

I checked out the ATI site. An X1600 is supposed to handle up to 720p. However, that isn't hardware decoding, it's "hardware assisted" decoding: CPU usage doesn't go from 80% to 0%, it goes from 80% to 60%. The DMS-02 does 720p on its own with no CPU usage at all (well, it includes its own dual-core CPU), and uses 1 watt of power. That's something you can put into a MacBook or even an ultraportable.
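Putting a number on "hardware assisted": the 80% to 60% drop means only a quarter of the decode work actually moved off the CPU.

```python
# How much of the decode load the GPU assistance removes, using the
# CPU-usage figures quoted above.

cpu_software_only = 0.80  # CPU load decoding purely in software
cpu_with_assist = 0.60    # CPU load with GPU assistance

offloaded = (cpu_software_only - cpu_with_assist) / cpu_software_only
print(round(offloaded, 2))  # 0.25: only 25% of the work was offloaded
```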

TBi
Mar 11, 2007, 10:57 AM
I checked out the ATI site. An X1600 is supposed to handle up to 720p. However, that isn't hardware decoding, it's "hardware assisted" decoding: CPU usage doesn't go from 80% to 0%, it goes from 80% to 60%. The DMS-02 does 720p on its own with no CPU usage at all (well, it includes its own dual-core CPU), and uses 1 watt of power. That's something you can put into a MacBook or even an ultraportable.

That's why they are going to use the newer chips to do the encoding: because they have full hardware decoding.

psychofreak
Mar 11, 2007, 10:59 AM
This reminds me of one time when I was thinking about OpenGL and OpenAL and wondered what other Open*Ls there were. How bored were you?!

Lanbrown
Mar 11, 2007, 01:49 PM
The new ATi chips have H.264 decode/encode built in (AVIVO). All this means is that all the Macs will be getting ATi chips (or the nVidia equivalent). There won't be a dedicated chip other than that.

I don't get what he thinks the big hoopla is about. Anyone with half a brain could have predicted this. (Except maybe in the Mini or MacBook, which don't have ATi chips but may in the future.)

Intel could add it to their offerings as well. Some of the machines use integrated Intel graphics. Apple could be looking at going with one supplier, though, either ATI or Nvidia, and getting rid of Intel for the GPU on the lower machines.

gnasher729
Mar 11, 2007, 02:53 PM
That's why they are going to use the newer chips to do the encoding: because they have full hardware decoding.

Do they? And how do they fit into a notebook?

TBi
Mar 11, 2007, 02:54 PM
Intel could add it to their offerings as well. Some of the machines use integrated Intel graphics. Apple could be looking at going with one supplier, though, either ATI or Nvidia, and getting rid of Intel for the GPU on the lower machines.

The graphics card basically comes free with the northbridge. However, what they could do is use the new ATi/AMD northbridge, which has a built-in Radeon X1250 (or similar). Personally, I hope they stay with Intel for the northbridge, though.

matticus008
Mar 11, 2007, 04:47 PM
That's why they are going to use the newer Chips to do the encoding. Because they have full hardware decoding.
But still not 0% CPU utilization like a purpose-built dedicated chip. This is why, even though graphics hardware has had MPEG-2 decoding/encoding for several years now, dedicated MPEG cards were, until recently, not uncommon in graphics machines. AVIVO is not the dedicated chip referenced by the article.

That's not to say that the article is correct, but it would specifically be a dedicated chip and NOT part of the GPU.

TBi
Mar 11, 2007, 05:29 PM
Do they? And how do they fit into a notebook?

They just use a different graphics card.

But still not 0% CPU utilization like a purpose-built dedicated chip. This is why, even though graphics hardware has had MPEG-2 decoding/encoding for several years now, dedicated MPEG cards were, until recently, not uncommon in graphics machines. AVIVO is not the dedicated chip referenced by the article.

That's not to say that the article is correct, but it would specifically be a dedicated chip and NOT part of the GPU.

I have never, ever encoded anything on my Mac. So if I had the choice between a better graphics card that helps encoding and a dedicated chip, I'd go with the better graphics card any day.

I don't want to encode H.264, and I don't want to pay extra for the privilege of doing so. Not unless it comes as part of a better graphics card that lets me play games and do 3D better.

psychofreak
Mar 11, 2007, 05:41 PM
I don't want to encode H.264, and I don't want to pay extra for the privilege of doing so. Not unless it comes as part of a better graphics card that lets me play games and do 3D better.

My thoughts exactly, but Apple has a tendency to do things like this; e.g. not many people use Front Row on a MacBook's small screen, but the remote is included, upping the price...

TBi
Mar 11, 2007, 06:22 PM
My thoughts exactly, but Apple has a tendency to do things like this; e.g. not many people use Front Row on a MacBook's small screen, but the remote is included, upping the price...

You're right, although I'd be complaining more about the speakers... you wouldn't be able to hear them if you were any distance away.

However, I do use my remote a lot with external speakers and an external monitor if I'm doing more than listening to music. It's very good for parties too (not that it's a good idea to have an expensive laptop out in the middle of a group of drunkards :) ).

guzhogi
Mar 11, 2007, 08:04 PM
How bored were you?!

Pretty bored. :p Besides, I was a CS major in college and this was an interesting topic.

matticus008
Mar 11, 2007, 08:24 PM
I have never, ever encoded anything on my Mac. So if I had the choice between a better graphics card that helps encoding and a dedicated chip, I'd go with the better graphics card any day.
If your desire is to improve performance in the living room and in multimedia operations, then the dedicated chip is better. If you're a professional user (the target of the higher-end models), there's a strong chance you need to encode properly. If you're a consumer, there's a strong chance you'll be using H.264 encoding for your television, your future iPod, or your video camera. If you have no interest in any of these things, then simply ignore it.
I don't want to encode H.264, and I don't want to pay extra for the privilege of doing so. Not unless it comes as part of a better graphics card that lets me play games and do 3D better.
First of all, the two are not mutually exclusive. Better graphics cards are completely irrelevant to the hypothetical proposal of dedicated H.264 hardware (which is as much about decoding as encoding). More to the point, however, the prices of Apple products don't fluctuate. They set price points and include hardware based on those parameters. An iMac is going to be $999 either way. In your view, the money for a dedicated chip might be better put toward a more expensive GPU overall.

In the view of much of the public, however, the cost of OS X would be better put to use in dropping prices so they could run Windows on a pretty Mac. On the flip side of that same coin, there are plenty of Mac users who wish all the money spent (and cost incurred) on the little features and stylish design went into more expensive video cards or bigger hard drives or more RAM. "Who needs magnetic latches or illuminated keyboards or aircraft-grade aluminum enclosures? Put a [hotshot gamer video card] in instead!"

Of course, if this idea does happen, there will be a dozen threads about how Apple missed a big opportunity and how their products are just outrageous and unacceptable. Never mind the reality that all the people wanting real media center capabilities will be thrilled, that Apple sales will remain strong, and that the move will inevitably be copied by other manufacturers.

aLoC
Mar 12, 2007, 05:44 AM
Maybe a better solution would be to go cold turkey and stop using H.264, which is bloated junk to begin with.

You can get quite nice-looking movies using the existing (Sorenson?) QuickTime codec, and the files are not as huge as people make out.

Just choose a resolution that is slightly lower than 1080i but still noticeably (to the average person) higher than 640x480.
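The gap between those resolutions is bigger than it sounds, since pixel counts scale with both dimensions. A quick comparison (1280x720 is just one plausible "in-between" size, my example rather than the poster's):

```python
# Pixel counts for the resolutions discussed above. 1280x720 is one
# example of a size between 640x480 and a full 1080 raster.

resolutions = {
    "640x480":   640 * 480,
    "1280x720":  1280 * 720,
    "1920x1080": 1920 * 1080,
}

sd = resolutions["640x480"]
for name, pixels in resolutions.items():
    print(name, pixels, round(pixels / sd, 2))
# 640x480 is 307,200 pixels; 1920x1080 is 6.75x that, so anything even
# "slightly lower than 1080i" still costs several times SD to decode.
```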

fraggle
Mar 12, 2007, 09:33 AM
This implies something more than upgrading the GPU, since both the mini and the MacBook don't have a GPU.

They both do; it's just a not-so-fast one without dedicated memory.

Krevnik
Mar 12, 2007, 11:38 AM
Maybe a better solution would be to go cold turkey and stop using H.264, which is bloated junk to begin with.

You can get quite nice-looking movies using the existing (Sorenson?) QuickTime codec, and the files are not as huge as people make out.

Just choose a resolution that is slightly lower than 1080i but still noticeably (to the average person) higher than 640x480.

Sorenson doesn't play on anything other than QuickTime and ffmpeg. Not to mention QT has about 30 different codecs that can be used; the only portable ones are H.264 and MPEG-4.

I don't even see the point in adding chips for decoding in the first place. It isn't like MPEG-4 and H.264 have huge decoding issues (like MPEG-2 had back when DVD drives were being added to laptops and desktops).

For encoding, I can kind of see the point, but then again, you also have most of Apple's line already capable of realtime encoding. Not sure what the gain will be there.

(As for the "bloated junk" comment, compression isn't as clear-cut as people think. To get better results without sacrificing apparent quality, more 'work' needs to be applied during the encoding/decoding process. There is already work on wavelet codecs that look better than H.264 and MPEG-4 at the same file size, but they are even more CPU-intensive.)

gnasher729
Mar 12, 2007, 12:33 PM
Maybe a better solution would be to go cold turkey and stop using H.264, which is bloated junk to begin with.

If you think that h.264 is "bloated junk" then I seriously suggest that you don't have the slightest clue what you are talking about.

Diatribe
Mar 12, 2007, 02:33 PM
So, this could be a way for Apple to incorporate DVD ripping into iTunes...


Now that would be pretty awesome, although I definitely don't see them doing that.

Highland
Mar 12, 2007, 09:04 PM
If you think that h.264 is "bloated junk" then I seriously suggest that you don't have the slightest clue what you are talking about.
Hahaha. Yeah. I agree.

And even *if* H.264 were junk (it's definitely NOT), all the legitimate new media seems to be using it (iTunes/HD-DVD/Blu-ray/etc.).

What a strange thing to say.