All Intel Macs with Nvidia and ATI graphics cards should already have hardware H.264 decoding. IIRC the ATI cards can even do hardware encoding (ATI was working on this a while back, but I haven't heard much about it since).

So it should just be a matter of enabling those features.

That was my thought. Why put a dedicated chip in when you already have a $50 chip in there that does what you want?

I don't think the X1600 had the encoder built in, but the newer models do. It's just a matter of turning it on. However, if it is built into the graphics card, then the faster the card, the faster the encoding.
http://ati.com/technology/Avivo/index.html

Yeah, Avivo is in there. It just needs to be enabled. Why add extra hardware when the GPU is there waiting to be used to encode/decode? I know it took some time for Windows to get a driver for it.

Most of the talk that I've heard is that it is ridiculous to go beyond what the GPU does.
 
All Intel Macs with Nvidia and ATI graphics cards should already have hardware H.264 decoding. IIRC the ATI cards can even do hardware encoding (ATI was working on this a while back, but I haven't heard much about it since).

So it should just be a matter of enabling those features.

I remember when they first announced the iMac and MacBook Pro with the X1600, they touted the ability to do hardware H.264 decoding; I don't think encoding was ever mentioned. This may be more in reference to the MacBook, Mac mini, and edu-line iMac models that use the GMA950, though, which presently don't do hardware H.264.
 
First of all, yeah right; second of all, why? While H.264 produces smaller files, they're still massive compared to YouTube-ish vids, and even on the latest Core 2 Duo systems they tax the processor. I know, I know, "but the quality is better!" you say. Yep, but YouTube and the like have shown us that most people don't give a toss about the quality, just the bare minimum to get by.

No, if anything, this is for converting DVDs for "personal" use, or it's not going to happen at all. I'd love to have process-specific dedicated hardware for lots of things (think what the Cell processor was designed for), and I hope that happens like it did for graphics cards.

Watching videos on YouTube makes me want to gag. They are horrid quality, and I would much rather wait a little longer for them to download.
 
It would be especially cool for the iPhone (if Apple also provides a camera lens on the same side as the screen) -- just pull out your iPhone while at Starbucks (using WiFi ... I don't know about AT&T's EDGE) and have a video chat with your girlfriend.

I've been sure that Jobs has kept some serious features waiting for the final product announcement. I think this is it.

Seriously, that "smells" right as the "one more thing" - very technically feasible, and has a serious bang. Cue "Thus spake Zarathrustra." First note, show a clip of the commercial from the 90's with a video phone and the quote, "..and the company that'll bring it to you is AT&T." Second note, show a clip of the Macintosh (or NeXT cube, grin) introduction on a pedestal. Third note, show a clip of _2001: A Space Odyssey_, with the video phone and the AT&T symbol (Cingular is rebranding themselves as AT&T). Crescendo, cut to and fade, Phil Schiller chatting with, say, a simulated kid telling him that he loves him and wishes he could be there to tuck him in tonight.

Pure magic. Instant pre-sales through the roof. Stockholders, rejoice.

Seriously, I think you've nailed it. The bandwidth is there, the hardware capability is there, the marketing case for it is compelling. There's no reason not to do it, unless power consumption and space make it unpossible :) .
 
Well, this might seem like a dumb question. Anyway, would this help with any other type of encoding? For example, would this speed up the encoding in iDVD when compressing and encoding to burn a DVD? Also, if so, would it really help the processors out that much?
 
I've been sure that Jobs has kept some serious features waiting for the final product announcement. I think this is it.

Seriously, that "smells" right as the "one more thing" - very technically feasible, and has a serious bang. Cue "Thus spake Zarathrustra." First note, show a clip of the commercial from the 90's with a video phone and the quote, "..and the company that'll bring it to you is AT&T." Second note, show a clip of the Macintosh (or NeXT cube, grin) introduction on a pedestal. Third note, show a clip of _2001: A Space Odyssey_, with the video phone and the AT&T symbol (Cingular is rebranding themselves as AT&T). Crescendo, cut to and fade, Phil Schiller chatting with, say, a simulated kid telling him that he loves him and wishes he could be there to tuck him in tonight.

Pure magic. Instant pre-sales through the roof. Stockholders, rejoice.

Seriously, I think you've nailed it. The bandwidth is there, the hardware capability is there, the marketing case for it is compelling. There's no reason not to do it, unless power consumption and space make it unpossible :) .

Sounds great, but seriously, reading your post, the excitement, how much coffee have you had today? :D
 
First of all, yeah right; second of all, why? While H.264 produces smaller files, they're still massive compared to YouTube-ish vids, and even on the latest Core 2 Duo systems they tax the processor. I know, I know, "but the quality is better!" you say. Yep, but YouTube and the like have shown us that most people don't give a toss about the quality, just the bare minimum to get by.

No, if anything, this is for converting DVDs for "personal" use, or it's not going to happen at all. I'd love to have process-specific dedicated hardware for lots of things (think what the Cell processor was designed for), and I hope that happens like it did for graphics cards.
H.264 isn't bigger; it depends on the bitrate you encode with. It's one of, if not the, best video compression algorithms currently out there. You can encode at a low bitrate (i.e. a small file with poor quality), or encode 1080p in awesome quality with a file size that is still a fraction of a typical HD DVD.
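To put rough numbers on that bitrate point, here is a minimal sketch; the bitrates and running time are illustrative assumptions, not figures from the post:

```python
# File size is roughly bitrate x duration, regardless of codec. The
# bitrates below are illustrative guesses, not figures from the thread.

def file_size_gb(bitrate_mbps, minutes):
    """Approximate file size in GB for a stream at the given average bitrate."""
    bits = bitrate_mbps * 1_000_000 * minutes * 60
    return bits / 8 / 1_000_000_000  # bits -> bytes -> gigabytes

movie_minutes = 120  # a two-hour movie
for label, mbps in [("low-bitrate H.264 (web-ish quality)", 1.5),
                    ("high-quality 1080p H.264", 8.0),
                    ("typical HD disc MPEG-2/VC-1 stream", 25.0)]:
    print(f"{label:38s} ~{file_size_gb(mbps, movie_minutes):.1f} GB")
```

With those assumptions, the two-hour movie lands at roughly 1.4 GB, 7 GB, and 22 GB respectively, which is the "fraction of a typical HD DVD" the poster is describing.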
 
Please enlighten me further. :)
Does this mean that my iMac (latest model) will (can) have this feature?
If so, how will it work? :confused:
No, not as specified in the article. AVIVO is not a dedicated chip, but rather part of the GPU. Enabling AVIVO would certainly get you better performance, but that's evidently not the goal. The idea behind this rumor is to include a separate chip that does H.264 encoding/decoding independently of the CPU and GPU. In many ways this is just an extension of the multiple-core idea.
That was my thought. Why put a dedicated chip in when you already have a $50 chip in there that does what you want?
For the same reason that you now have two CPUs instead of just one, even though one does what you want. Having a dedicated chip frees up the other subsystems for other tasks. With a separate unit, you could play a game while encoding H.264 in the background--something that you can't pull off too well even with mainstream AVIVO cards.
A $50 chip means a $100 difference in retail price.
Absolutely untrue--there is no single metric for pricing, except perhaps the overall margin, which would make that number about $60, not $100, and would still be a faulty conclusion to draw. The retail price isn't going to change substantially with or without this; it will simply sit in place of a price drop commensurate with its cost. When the PowerBook line switched from dirt-cheap G4s to the more expensive Core Duo, prices didn't go up.
 
Apple's margins on hardware are about 40%, so adding a $50 chip to a Mac mini is going to increase the retail price by roughly $80. I don't see that happening.
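For what it's worth, here is the back-of-the-envelope arithmetic behind the $80 and $100 figures being thrown around in this thread; the 40% gross margin and $50 part cost are the posters' assumptions, not confirmed numbers:

```python
# Rough arithmetic behind the competing retail-price figures in this thread.
# The 40% gross margin and $50 part cost are the posters' assumptions.

part_cost = 50.0

# "A $50 chip means a $100 difference in retail price" assumes roughly a
# 2x cost-to-retail multiplier:
doubled = part_cost * 2  # $100

# Holding a 40% gross margin instead, where margin = (price - cost) / price,
# every extra dollar of cost has to be divided by 0.6:
margin_based = part_cost / (1 - 0.40)  # ~$83, i.e. the "roughly $80" above

print(f"2x cost-to-retail assumption: +${doubled:.0f} at retail")
print(f"40% gross margin assumption:  +${margin_based:.0f} at retail")
```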

I think the source is simply referring to Intel's next chipset, the G35, which will probably cost $50 in the quantities that Apple buys, and which would probably end up in the Mac mini, MacBook, and 17" iMac in Q3/Q4 2007.
 
A $50 chip means a $100 difference in retail price. Does anyone really think they're going to raise the price of the Mac mini by $100 when the Mini can already play back H.264 when not heavily loaded? That's a lot of money for something that is only useful to people like me who rip DVDs.

It's unlikely Apple would pay retail price for the chip. Ordered in mass quantity, the price could be $10.
 
Apple TV

I'm amazed that nobody has considered the idea that this might lead to TiVo-like features built into Front Row or the Apple TV. Rather than encoding TV into MPEG-2 or the .tivo format, a Mac could encode directly into an iPod-friendly H.264 format. We already know that Mac OS X can directly control a cable box and pull out live video using a FireWire connection (see the FireWire SDK at Apple.com). With this chip, you could connect your Mac to your cable box, pull out live video, and have it encoded live into H.264. You could then watch these saved files in Front Row, on your Apple TV, or on your iPod. Sounds brilliantly easy to me.
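As a purely illustrative software stand-in for that encode step (not anything Apple has announced), here is a minimal sketch that transcodes a captured MPEG-2 transport stream into an iPod-friendly H.264 file. It assumes ffmpeg with libx264 is installed, and "capture.m2t" is a hypothetical filename:

```python
# Illustrative only: a software analogue of the encode step described above.
# Assumes ffmpeg with libx264/AAC support is installed; "capture.m2t" is a
# hypothetical FireWire capture file, and the settings are guesses.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "capture.m2t",             # MPEG-2 transport stream from the cable box
    "-c:v", "libx264",               # encode video to H.264
    "-profile:v", "baseline",        # baseline profile for iPod playback
    "-s", "640x480",                 # iPod-era resolution
    "-b:v", "1500k",                 # modest video bitrate
    "-c:a", "aac", "-b:a", "128k",   # AAC audio
    "recording.mp4",
], check=True)
```

A dedicated chip would simply do this in hardware, in real time, without tying up the CPU.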
 
FINALLY!! But let the iPhone play DivX, too!

H.264 encoding is prohibitively slow, even on Core Duo machines. I believe Apple has some sort of vested interest in making H.264 the encoding standard in the upcoming years (especially since the iPod doesn't play DivX, which is, IMHO, far superior). If they want to do this, they need to make it easier on the machine. Heck, my 5.5G iPod plays H.264 SO much better than my loaded PowerBook; it's rather sad, really... however, even my Treo plays 800x600 DivX flawlessly, and it looks basically the same, at smaller file sizes, and with much faster encoding to boot!

I reaaaaally hope the iPhone plays DivX at some point (a la MPlayer). Otherwise, my Palm OS Treo is superior in this capacity, especially since it can dynamically resize full-screen DivX video to fit its small screen.

If Apple makes H.264 encoding easier and faster, though, I think it can begin to assert itself in the mobile video market, as iTunes' easy CD ripping did for music, and this incentive may cause Apple to initially underwrite the cost of including the chip...
 


Robert Cringely claims that Apple is planning on incorporating dedicated H.264 decoding chips into future Mac hardware.

The article claims that incorporating a dedicated H.264 decoding chip will allow Apple to ensure the same base performance on every machine it sells. The $50 chip is said to also offer H.264 encoding to allow users to quickly encode high quality video clips for upload to the internet.

Cringely is widely regarded as nothing more than an attention whore who makes outlandish (and frequently WRONG) predictions just to stir up controversy (and site hits).

It doesn't make sense to incorporate such a chip, because it wastes the modularity of computer hardware, which can support future codecs of any kind via software. The answer isn't a dedicated ASIC for H.264 that will be entirely obsolete within a year... at least not for Apple. Good industrial design, not preprogrammed obsolescence, is what adds value to Apple products. In this case, good industrial design means incorporating more powerful multipurpose CPU and GPU chipsets as opposed to dedicated encoder/decoder chips.

Keep blindly throwing darts at that board, Cringely... eventually one of them will hit a bullseye by sheer luck.
 
Just a ploy for lower costs from ATI & nVidia

No way is Apple going to put in a $50 dedicated chip, for all the very intelligent reasons stated above. This is probably just a strategic leak to help push ATI into lowering its royalty costs to unlock AVIVO for Apple. Kind of like reverse FUD.
 
I was addressing that by saying that hardware H.264 encoding/decoding at this point is already 'too expensive' to be worth doing. 1080p H.264 decoding can be done now in software across Apple's entire current line. That leaves encoding as the main benefit, and H.264 can already be encoded in real time on some of Apple's product line.

For Apple to be willing to drop $50 on an H.264 chip, they are either expecting to shove HD DVD or Blu-ray into the entire line (not likely; I'm not sure people want a $1,000 Mini with 512MB of RAM and no GPU)... or they are expecting to do something HUGE with encoding (media center competition?). By the time the chip starts doing good things for Apple's sales, software will be able to nearly overtake the hardware, and they'll take it back out again.

Cringely's musings, while entertaining, have really made me wonder why I looked up to the guy when I was younger. His point that this chip is superior to a software solution banks on this 'cheap' chip doing things better than a CPU, but misses the point of a cheap, specialized chip in the first place. A DVD player needs to be cheap, and you can't go around putting a $200 CPU into it just to play DVDs... you build a custom DSP for DVDs, pair it with a $10 CPU, and save more than 50% on your costs. A computer is already expensive, so using a bunch of specialized chips doesn't make much sense when the $200 CPU can do it all.

I think it makes sense if you're trying to avoid taxing the CPU when it's doing other things while encoding and decoding. It depends on how serious Apple has become about the Mac as a media center.
 
I think it makes sense if you're trying to avoid taxing the CPU when it's doing other things while encoding and decoding. It depends on how serious Apple has become about the Mac as a media center.

It also assumes that the Mac will become a TiVo. I don't see that. Microsoft doesn't even see that, but it lets you use Media Center as a TiVo because the real show isn't ready yet.

With a good implementation of IPTV, using H.264, I can see the Mac not even needing this chip, and Apple TV being able to access your next-gen Comcast subscription without the Mac as a hub (unless you want to record, maybe, and even then, it is just saving a file to disk, rather than any real encoding job).

The future is that the DVR as we know it doesn't exist. Instead, you have internet media streaming boxes (using bandwidth freed up by the migration from old-style QAM digital cable to IP services), both in the classic form we are used to (to avoid making people uncomfortable) and in the form of cheap HTPCs like the Mini, the Apple TV, and even cheap PCs that don't need CableCARDs, hardware encoders, and so on.

One link, multiple services... IP for phone, the web, and media. Cable companies already have the infrastructure in place to provide media; it is just locked up in older cable bandwidth. (A Comcast 6 Mbps internet link only takes up one HD channel's worth of bandwidth, so think of the local link speeds Comcast could provide to customers for streaming Comcast media over IPTV.)
 
Cringely is widely regarded as nothing more than an attention whore who makes outlandish (and frequently WRONG) predictions just to stir up controversy (and site hits).

It doesn't make sense to incorporate such a chip, because it wastes the modularity of computer hardware, which can support future codecs of any kind via software. The answer isn't a dedicated ASIC for H.264 that will be entirely obsolete within a year... at least not for Apple. Good industrial design, not preprogrammed obsolescence, is what adds value to Apple products. In this case, good industrial design means incorporating more powerful multipurpose CPU and GPU chipsets as opposed to dedicated encoder/decoder chips.

Keep blindly throwing darts at that board, Cringely... eventually one of them will hit a bullseye by sheer luck.

I think you're a bit off here. Cringely isn't taking credit for this, but rather speculating on rumors that he's heard.

Cringely said:
Now comes the rumor I have heard, that I believe to be a fact, that has simply yet to be confirmed. I have heard that Apple plans to add hardware video decoding to ALL of its new computers beginning fairly soon, certainly this year.

Modularity of computer hardware means little with today's OS X. Developers no longer write to the metal; everything is abstracted to the point where Apple can keep OS X as portable as possible. Look at the iPhone, which runs a reduced subset of OS X, for example. In fact, if you look at the new QTKit, which is replacing the older 32-bit QuickTime framework, you'll find that it is well insulated from the hardware. So Apple can in fact add a dedicated encoder/decoder chip, and the process of making QuickTime use the chip shouldn't be too difficult.

If good industrial design means adding more powerful CPUs and GPUs, then why is AMD pursuing its Torrenza platform, which offers just what Apple is rumored to be doing here: plug-in co-processor modules linked with HyperTransport?

http://en.wikipedia.org/wiki/Torrenza

The idea makes perfect sense if the dollars match up well. Apple wants to sell HD content via iTunes. They need a low-bandwidth codec... check: H.264/AVC. Now they need a solid way to play back said content without requiring a Mac Pro.

They're not going to put discrete graphics in the Mac mini and MacBooks, so forget Avivo or PureVideo. If they can put in a dedicated co-processor for AVC encode/decode, a world of options opens up.

iLife '07: iDVD and iMovie handle AVC-based codecs much more easily.
Leopard: iChat AV and QTKit handle AVC encode/decode much better.
Pro: Final Cut Studio and Compressor all handle AVC more easily.

Yes, you can use the GPU, but you still have finite performance per watt. Leopard is going to be using the GPU much harder than Tiger, with OpenGL 2.1 being multithreaded and AVC being used even more. Apple's going to need some efficient compression. They are claiming "significant" improvements to AVC encoding in QTKit, but we're going to need more.

This may just be a rumor, but it certainly makes sense, and OS X allows this to happen quite easily, IMO.
 
If this is true, I'm hoping it's because Apple will introduce "Media Center" capabilities into OS X, allowing users to record and stream television shows from their computer to the Apple TV.

Or... Perhaps allow users to legally rip DVDs into iTunes (although I'm doubting this for two reasons: right now it's illegal, and it might affect iTunes movie sales).
 