I really don't see the point in the more consumer Macs, unless they are going to really push H.264 in Leopard.
I agree. This guy couldn't read the ingredients off of a box of cereal and get it correct. A chip cost cited at $50 is meaningless; Apple has huge buying power and can negotiate prices way, way down, below even wholesale (buying RAM from Apple is one of the biggest rip-offs ever).
Furthermore, Apple isn't going to simply "eat" the cost; it will be figured in somehow, offset by the reduced cost of some other component or amortized over time, as with every other product rollout.
This may or may not happen, but with this guy, place your money on it NOT happening (the way he described it, at least).
To me, this is like Apple adopting the Amiga philosophy a few decades late. The Amiga worked around the limits of the chips of its day by having several specialized chips: one for graphics, one for sound, a general-purpose CPU, etc.
So a $50 chip will be faster than the Core 2 Duo monsters?
http://ati.com/technology/Avivo/index.html
Yeah, Avivo is in there. It just needs to be enabled. Why add extra hardware when the GPU is there waiting to be used to encode/decode? I know it took some time for Windows to get a driver for it.
Most of the talk that I've heard is that it is ridiculous to go beyond what the GPU does.
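For what it's worth, you can get a feel for how much CPU a pure software decode eats by just timing it. A rough sketch, assuming ffmpeg is installed and "sample_720p.mp4" is a stand-in for whatever test clip you have:

import subprocess
import time

clip = "sample_720p.mp4"  # hypothetical test file

start = time.time()
# Decode to the null muxer: no encode, no output file, just the decode
# work that AVIVO or a dedicated chip would be taking off the CPU.
subprocess.run(["ffmpeg", "-benchmark", "-i", clip, "-f", "null", "-"], check=True)
print(f"wall time: {time.time() - start:.1f}s")
# ffmpeg's -benchmark flag prints its own CPU-time figure; compare it to
# the clip's duration to gauge the % load software decoding costs you.

If the CPU time comes out close to (or over) the clip's running length, that's exactly the load everyone wants moved into hardware.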
I checked out the ATI site. An X1600 is supposed to handle up to 720p. However, that isn't hardware decoding, it is "hardware assisted" decoding. CPU usage doesn't go from 80% to 0%, it goes from 80% to 60%. The DMS-02 does 720p on its own, no CPU usage at all (well, it includes its own dual core CPU), and uses 1 Watt of power. That's something you can put into a MacBook or even an ultraportable.
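Back-of-envelope, that difference matters a lot in a notebook. A quick sketch of the comparison; the CPU power figure is my assumption (ballpark TDP for a Core Duo class mobile chip), and power doesn't really scale linearly with load, so treat it as order-of-magnitude only:

CPU_MAX_WATTS = 31.0      # assumed mobile CPU TDP, not a measured figure
ASSISTED_CPU_LOAD = 0.60  # "hardware assisted" decode, per the numbers above
DEDICATED_WATTS = 1.0     # the DMS-02 figure quoted above

assisted_watts = CPU_MAX_WATTS * ASSISTED_CPU_LOAD  # crude linear model
print(f"GPU-assisted decode: ~{assisted_watts:.0f} W of CPU power")
print(f"Dedicated chip:      ~{DEDICATED_WATTS:.0f} W, CPU nearly idle")

Roughly 19 W versus 1 W is the difference between a movie draining the battery and not.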
How bored were you?! This reminds me of one time, I was thinking about OpenGL and OpenAL and wondered what other Open*Ls there were.
The new ATi chips have H.264 decode/encode built in (AVIVO). All this means is that all the Macs will be getting ATi chips (or the nVidia equivalent). There won't be a dedicated chip other than that.
I don't get what he thinks the big hoopla is about. Anyone with half a brain could have predicted this. (Except maybe in the mini or MacBook, which don't have ATi chips but may in the future.)
That's why they are going to use the newer chips to do the encoding: they have full hardware decoding.
Intel could add it to their offerings as well. Some of the machines use the integrated Intel graphics. Apple could be looking at going with one supplier, though, either ATI or Nvidia, and getting rid of Intel for the GPU on the lower machines.
That's why they are going to use the newer chips to do the encoding: they have full hardware decoding.
Do they? And how do they fit into a notebook?
But still not 0% CPU utilization like a purpose-built dedicated chip. This is why even though graphics hardware has had hardware MPEG2 decoding/encoding for several years now, until recently, MPEG cards were not uncommon in graphics machines. AVIVO is not the dedicated chip referenced by the article.
That's not to say that the article is correct, but what it describes would specifically be a dedicated chip and NOT part of the GPU.
I don't want to encode H.264; I don't want to pay extra for the privilege of doing so. Not unless it comes in a better graphics card which lets me play games and do 3D better.
My thoughts exactly, but Apple has a tendency to do things like this; e.g. hardly anyone uses Front Row on a MacBook's small screen, but the remote is included, upping the price...
If your desire is to improve performance in the living room and in multimedia operations, then the dedicated chip is better. If you're a professional user (the target of the higher-end models), there's a strong chance you do need to encode properly. If you're a consumer, there's a strong chance that you'll be using H.264 encoding with your television, your future iPod, or your video camera. If you have no interest in any of these things, then simply ignore it. I have never, ever, encoded anything on my Mac. So if I had the choice between a better graphics card which helps encoding and a dedicated chip, I'll go with the better graphics card any day.
First of all, the two are not mutually exclusive. Better graphics cards are completely irrelevant to the hypothetical proposal of dedicated H.264 hardware (which is as much about decoding as encoding). More to the point, however, the prices of Apple products don't fluctuate. They set price points and they include hardware based on those parameters. An iMac is going to be $999 either way. A dedicated chip might, in your view, be better put to use on a more expensive overall GPU. I don't want to encode H.264; I don't want to pay extra for the privilege of doing so. Not unless it comes in a better graphics card which lets me play games and do 3D better.
This implies something more than upgrading the GPU, since neither the mini nor the MacBook has a discrete GPU.
Maybe a better solution would be to go cold turkey and stop using H.264, which is bloated junk to begin with.
You can get quite nice-looking movies using the existing (Sorenson?) QuickTime codec, and the files are not as huge as people make out.
Just choose a resolution that is slightly lower than 1080i but still noticeably (to the average person) higher than 640x480.
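To put numbers on that, here's a rough size estimate for an in-between resolution; the bits-per-pixel figure and the 960x540 frame size are my assumptions for illustration, not anything from a spec:

BITS_PER_PIXEL_PER_FRAME = 0.10  # assumed; typical-ish for a decent codec
fps = 24
width, height = 960, 540  # hypothetical resolution between 640x480 and 1080
minutes = 90

bitrate = width * height * fps * BITS_PER_PIXEL_PER_FRAME  # bits per second
size_gb = bitrate * minutes * 60 / 8 / 1e9
print(f"~{bitrate / 1e6:.1f} Mbit/s -> ~{size_gb:.1f} GB for a {minutes} minute movie")

That works out to about 1.2 Mbit/s and under a gigabyte for a feature-length film, which is hardly monstrous.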