I really don't see the point in the more consumer Macs, unless they are going to really push H.264 in Leopard.

iTunes movie downloads- h.264
iChat Theater - h.264
Quicktime improvements- h.264
Apple TV support- MPEG4 which includes h.264

h.264 is very important to Apple. It's the codec of choice.
 
This guy couldn't read the ingredients off of a box of cereal and get it right. A chip cost cited at $50 is meaningless; Apple has huge buying power and can negotiate prices way, way down, below even wholesale prices (buying RAM from Apple is one of the biggest rip-offs ever).

Furthermore, Apple isn't going to simply "eat" the cost; it will be figured in somehow, whether through reduced costs of some other component or amortized over time, as with every other product rollout.

This may or may not happen, but with this guy as the source, place your money on it NOT happening (the way he described it, at least).
I agree.

Also... how will this aid PVR or media center functionality? My Mac Mini and EyeTV don't need to do ANY video encoding... The digital TV stream is already compressed. The EyeTV software just captures it and saves it to disk.

The only reason I can see this happening is if Apple decides to turn iTunes into a DVD ripping and library application as well (which is possible, but doubtful).
 
Come to think of it, a dedicated processor for H.264 encoding/decoding would be very welcome. My generation 1 MacBook gets really busy when playing 720p content, especially when I have all those "always on" apps like Mail, Firefox, iChat, Skype, iTunes, and iCal running all the time.

I'm totally satisfied with the processing power of the MacBook, though I had to pimp out the hard drive and RAM to the maximum to fit my needs. The only thing that is painfully slow is encoding my home movies into H.264 (my camera only records uncompressed, i.e. 8 minutes = 1 GB). It just takes hours and the fans are on max the whole time. With that chip, the CPU would be free for other useful stuff and, considering that chip only takes 1 watt, the whole thing would be much quieter.

It's also annoying to have videos skip a few frames because some other app decides to eat up all of the CPU power for a millisecond.
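
For a sense of scale on that "8 minutes = 1 GB" figure, here's a rough back-of-the-envelope sketch. The 2.5 Mbit/s H.264 target is just my own assumption for SD home video, not anything the camera or the rumored chip specifies:

```python
# Rough numbers for the "8 minutes = 1 GB" camera footage mentioned above.
# The 2.5 Mbit/s H.264 target is an assumed figure for SD home video.

def bitrate_mbps(size_bytes: float, duration_s: float) -> float:
    """Average bitrate in megabits per second."""
    return size_bytes * 8 / duration_s / 1_000_000

source_bytes = 1_000_000_000      # ~1 GB of footage
duration_s = 8 * 60               # ~8 minutes

print(f"camera data rate: {bitrate_mbps(source_bytes, duration_s):.1f} Mbit/s")

h264_rate_mbps = 2.5              # assumed H.264 target for SD material
h264_bytes = h264_rate_mbps * 1_000_000 / 8 * duration_s
print(f"estimated H.264 size: {h264_bytes / 1_000_000:.0f} MB "
      f"(~{source_bytes / h264_bytes:.0f}x smaller)")
```

Under those assumptions the camera is writing roughly 17 Mbit/s, and the H.264 version would come out around 150 MB instead of 1 GB, which is exactly why the encode step is the one that pegs the CPU for hours.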
 
To me, this is like Apple adopting the Amiga philosophy a few decades late. The Amiga worked around the limits of the chips of its day by having several specialized chips: one for graphics, one for sound, a general-purpose CPU, etc.

It's just that today, the GPUs and the northbridge and southbridge chips work pretty much like the old Amiga chipset. The CPU of today also has more cycles to spare than it did back then.

I was thinking along the same lines as you when I read the report, but I had a different take. After the Amiga 3000 was released, they were working on the Amiga 3000+. The system engineer, Dave Haynie, had an AT&T DSP on the motherboard. Depending on the configuration, the DSP could become an audio processor or a full modem. Sadly, Commodore was too short-sighted to actually bring this to market.

I recommend a book called "On The Edge". It's a pretty faithful account of the early years of computing, without the Apple bias. (Please note that I'm a Mac user and Apple shareholder. But I know the history and know that Apple wasn't the only one out there back in the day.)
 
H.264 is used in Blu-ray movies, and all the hi-def (and standard-def) QuickTime stuff. It would be nice if Apple included a competent H.264 decoder that's better than what NVIDIA/ATI provide, because their solutions don't use any advanced features to improve the image for HD content.

On another note, someone in the know blurbed out that the next Apple Cinema HD displays will support HDMI (presumably spec 1.3a) and wide gamut colors (supports 10 bit per channel), and they should be available Real Soon Now(tm).

These are exciting times...
 
So a $50 chip will be faster than the Core 2 Duo monsters?

Probably at encoding/decoding H.264. The C2D is a general-purpose chip: it's not really optimized to do any one specific thing. This $50 (or however much) chip is probably specifically optimized to encode/decode H.264, but can't do other stuff. It's basically a trade-off: do you want something that does everything at an OK speed, or something that does one thing, but blazing fast?

This reminds me of one time when I was thinking about OpenGL and OpenAL and wondered what other Open*Ls there were. I tried every other letter of the alphabet and found OpenRL, an API for making movie software that was designed for an add-on card by Aspex Semiconductors. This card sounds like it does the exact same thing. Check it out at www.aspex-semi.com
 
http://ati.com/technology/Avivo/index.html

Yeah, Avivo is in there. It just needs to be enabled. Why put in additional hardware when the GPU is there waiting to be used to encode/decode? I know it took some time for Windows to get a driver for it.

Most of the talk that I've heard is that it is ridiculous to go beyond what the GPU does.

I checked out the ATI site. An X1600 is supposed to handle up to 720p. However, that isn't hardware decoding, it is "hardware assisted" decoding. CPU usage doesn't go from 80% to 0%, it goes from 80% to 60%. The DMS-02 does 720p on its own, no CPU usage at all (well, it includes its own dual core CPU), and uses 1 Watt of power. That's something you can put into a MacBook or even an ultraportable.
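
To put rough numbers on that difference, here's a crude sketch. The 31 W CPU figure and the assumption that CPU power scales linearly with load are mine, not measurements of any actual machine:

```python
# Crude energy comparison for the two decode scenarios described above:
# "hardware assisted" decode that still keeps the CPU ~60% busy, versus a
# dedicated ~1 W chip that leaves the CPU idle. The 31 W CPU figure and the
# linear load-to-power scaling are simplifying assumptions, not measurements.

CPU_TDP_W = 31.0        # assumed mobile Core 2 Duo power at full load
MOVIE_HOURS = 2.0       # length of one movie

def decode_energy_wh(cpu_load: float, dedicated_chip_w: float = 0.0) -> float:
    """Energy used for one movie, in watt-hours."""
    return (CPU_TDP_W * cpu_load + dedicated_chip_w) * MOVIE_HOURS

print(f"GPU-assisted decode (60% CPU): {decode_energy_wh(0.60):.0f} Wh")
print(f"dedicated chip (idle CPU + 1 W): {decode_energy_wh(0.0, 1.0):.0f} Wh")
```

Even with generous rounding, that's the difference between eating a big chunk of a laptop battery per movie and barely touching it, which is why a 1 W part is interesting for a MacBook.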
 
I checked out the ATI site. An X1600 is supposed to handle up to 720p. However, that isn't hardware decoding, it is "hardware assisted" decoding. CPU usage doesn't go from 80% to 0%, it goes from 80% to 60%. The DMS-02 does 720p on its own, no CPU usage at all (well, it includes its own dual core CPU), and uses 1 Watt of power. That's something you can put into a MacBook or even an ultraportable.

That's why they are going to use the newer chips to do the encoding: because they have full hardware decoding.
 
The new ATi chips have H.264 decode/encode built in (AVIVO). All this means is that all the Macs will be getting ATi chips (or the nVidia equivalent). There won't be a dedicated chip other than that.

I don't get what he thinks the big hoopla is about. Anyone with half a brain could have predicted this. (Except maybe in the Mini or MacBook, which don't have ATi chips but may in the future.)

Intel could add it to their offerings as well. Some of the machines use the integrated Intel graphics. Apple could be looking at going with one supplier though, either ATI or Nvidia, and getting rid of Intel for the GPU on the lower machines.
 
Intel could add it to their offerings as well. Some of the machines use the integrated Intel graphics. Apple could be looking at going with one supplier though, either ATI or Nvidia, and getting rid of Intel for the GPU on the lower machines.

The graphics card basically comes free with the northbridge. However, what they could do is use the new ATi/AMD northbridge, which has a built-in Radeon X1250 (or similar). Personally I hope they stay with Intel for the northbridge though.
 
That's why they are going to use the newer chips to do the encoding: because they have full hardware decoding.
But still not 0% CPU utilization like a purpose-built dedicated chip. This is why even though graphics hardware has had hardware MPEG2 decoding/encoding for several years now, until recently, MPEG cards were not uncommon in graphics machines. AVIVO is not the dedicated chip referenced by the article.

That's not to say that the article is correct, but it would specifically be a dedicated chip and NOT part of the GPU.
 
Do they? And how do they fit into a notebook?

They just use a different graphics card

But still not 0% CPU utilization like a purpose-built dedicated chip. This is why even though graphics hardware has had hardware MPEG2 decoding/encoding for several years now, until recently, MPEG cards were not uncommon in graphics machines. AVIVO is not the dedicated chip referenced by the article.

That's not to say that the article is correct, but it would specifically be a dedicated chip and NOT part of the GPU.

I have never, ever, encoded anything on my Mac. So if I had the choice between a better graphics card which helps encoding and a dedicated chip, I'd go with the better graphics card any day.

I don't want to encode H.264, and I don't want to pay extra for the privilege of doing so. Not unless it comes in a better graphics card which lets me play games and do 3D better.
 
I don't want to encode H.264, and I don't want to pay extra for the privilege of doing so. Not unless it comes in a better graphics card which lets me play games and do 3D better.

My thoughts exactly, but Apple has a tendency to do things like this; e.g. not many people at all use Front Row on a MacBook's small screen, but the remote is included, upping the price...
 
My thoughts exactly, but Apple has a tendency to do things like this; e.g. not many people at all use Front Row on a MacBook's small screen, but the remote is included, upping the price...

You're right, although I'd be complaining more about the speakers... you wouldn't be able to hear it if you were any distance away from it.

However, I do use my remote a lot with external speakers and an external monitor if I'm doing more than listening to music. It's very good for parties too (not that it is a good idea to have an expensive laptop out in the middle of a group of drunkards :) ).
 
I have never, ever, encoded anything on my Mac. So if I had the choice between a better graphics card which helps encoding and a dedicated chip, I'd go with the better graphics card any day.
If your desire is to improve performance in the living room and multimedia operations, then the dedicated chip is better. If you're a professional user (the target of the higher-end models), there's a strong chance you do need to encode properly. If you're a consumer, there's a strong chance that you'll be using H.264 encoding on your television, for your future iPod, or with your video camera. If you have no interest in any of these things, then simply ignore it.
I don't want to encode H.264, and I don't want to pay extra for the privilege of doing so. Not unless it comes in a better graphics card which lets me play games and do 3D better.
First of all, the two are not mutually exclusive. Better graphics cards are completely irrelevant to the hypothetical proposal of dedicated H.264 hardware (which is as much about decoding as encoding). More to the point, however, the prices of Apple products don't fluctuate: they set price points and include hardware based on those parameters. An iMac is going to be $999 either way. In your view, the money for a dedicated chip might be better put toward a more expensive GPU overall.

In the view of much of the public, however, the cost of OS X would be better put to use in dropping prices so they could run Windows on a pretty Mac. On the flip side of that same coin, there are plenty of Mac users who wish all the money spent (and cost incurred) on the little features and stylish design went into more expensive video cards or bigger hard drives or more RAM. "Who needs magnetic latches or illuminated keyboards or aircraft-grade aluminum enclosures? Put a [hotshot gamer video card] in instead!"

Of course, if this idea does happen, there will be a dozen threads about how Apple missed a big opportunity and how their products are just outrageous and unacceptable. Never mind the reality that all the people wanting real media center capabilities will be thrilled, and that Apple sales would remain strong, and the move will inevitably be copied by other manufacturers.
 
Maybe a better solution would be to go cold turkey and stop using H.264 which is bloated junk to begin with.

You can get quite nice-looking movies using the existing (Sorenson?) QuickTime codec, and the files are not as huge as people make out.

Just choose a resolution that is slightly lower than 1080i but still noticeably (to the average person) higher than 640x480.
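
For reference, here's the raw pixel-count difference between the sizes being talked about; 1280x720 is just one example of a size that sits between 640x480 and full 1080, not something the original poster specified:

```python
# Pixel counts per frame for the resolutions mentioned above. 1280x720 is
# only an example of a size "slightly lower than 1080i but noticeably
# higher than 640x480"; any intermediate size works the same way.

resolutions = {
    "640x480":   (640, 480),
    "1280x720":  (1280, 720),
    "1920x1080": (1920, 1080),
}

base = 640 * 480
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels/frame ({pixels / base:.1f}x 640x480)")
```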
 
Maybe a better solution would be to go cold turkey and stop using H.264 which is bloated junk to begin with.

You can get quite nice-looking movies using the existing (Sorenson?) QuickTime codec, and the files are not as huge as people make out.

Just choose a resolution that is slightly lower than 1080i but still noticeably (to the average person) higher than 640x480.

Sorenson doesn't play on anything other than Quicktime and ffmpeg. Not to mention QT has about 30 different codecs that can be used. The only portable ones are H.264 and MPEG-4.

I don't even see the point in adding chips for decoding in the first place. It isn't like MPEG-4 and H.264 have huge decoding issues (like MPEG-2 had back when DVD drives were being added to laptops and desktops).

For encoding, I can kind of see the point, but then again, you also have most of Apple's line already capable of realtime encoding. Not sure what the gain will be there.

(As for the "bloated junk" comment, compression isn't exactly as clear-cut as people think. To get better results without sacrificing apparent quality, more 'work' needs to be applied during the encoding/decoding process. There is already work on wavelet codecs which look better than H.264 and MPEG-4 at the same file size, but they are even more CPU-intensive.)
 
Maybe a better solution would be to go cold turkey and stop using H.264 which is bloated junk to begin with.

If you think that h.264 is "bloated junk" then I seriously suggest that you don't have the slightest clue what you are talking about.
 