
milo

macrumors 604
Sep 23, 2003
6,891
522
And real-time, uncompressed HD is exactly what the video cards deliver.

As I said in an earlier post, you're talking about high quality, high bitrate HD.

Uncompressed is 1-3 gigabits per second. We're talking terabytes for an hour of footage. No consumer cards will ever support that; only high-end pros use uncompressed HD.
 

motulist

macrumors 601
Dec 2, 2003
4,234
611
Depending on what this chip really does, it may allow Apple to support Blu-Ray movie playback without ruining the rest of Mac OS X.

One of the big problems with Windows Vista is that it supports software playback of HD content. Thanks to the requirements imposed by the movie studios, Microsoft has had to specify and develop (whether it works or not) massive amounts of DRM infrastructure, including codecs that keep the video encrypted during the decoding process, the ability to selectively detect and disable video cards that don't observe the rules, tilt bits to detect bus snoopers, etc.

I suspect Apple doesn't want to play this game. If I were them, I'd refuse to compromise a good multimedia architecture simply because some movie studios demand it.

One possible way around all this might be a dedicated HD-decode chip as part of the video card. Mac OS can then feed the raw, encrypted content from the disc straight to the chip, and the decoded video goes directly to the video output without ever passing through main memory. Sort of like the overlay model used by video-playback cards, back before CPUs were powerful enough to play real-time video.
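A hypothetical sketch of that pass-through data path (every name here is invented; the point is only that the OS handles nothing but encrypted bytes):

```python
# Hypothetical sketch of the pass-through model described above.
# The OS shuttles still-encrypted blocks from the disc to a decode
# chip that holds the keys and owns the video output, so decoded
# frames never land in main memory and the OS needs no DRM plumbing.

class DecoderChip:
    """Stand-in for a dedicated decode chip on the video card."""

    def feed(self, encrypted_block: bytes) -> None:
        # Inside the chip: decrypt, decode, scan out to the display.
        # None of that is ever visible to the host OS.
        pass

def play_disc(disc_blocks, chip: DecoderChip) -> None:
    for block in disc_blocks:   # blocks are still encrypted here
        chip.feed(block)        # the OS acts as a dumb pipe
```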

This way, the studios get their insane encryption requirements met and Mac OS doesn't get crippled by the attempt to enforce them.

And, of course, when you're not playing a BD movie, you've got a nice powerful auxiliary GPU that can be used for whatever else the system needs.


Ah! Now here's an angle that makes sense! With today's powerful hardware in even low-end systems, a dedicated chip for encoding and decoding h.264 makes almost no sense; even the low end should be capable of encoding and decoding h.264 in real time at high quality, without too crazy a tax on CPU or battery power. But the idea is very sensible if the chip is specific to Blu-ray itself, so that Apple can offer Blu-ray playback while completely avoiding having to infect OS X with the ridiculous deep OS-level DRM that Blu-ray licensing requires.

Great analysis shamino!
 

milo

macrumors 604
Sep 23, 2003
6,891
522
That's my thinking as well. Might have been worthwhile a... well, a while ago, but it's really needed in older machines, not newer ones. Increasingly powerful processors have pretty much rendered it obsolete. It's kind of like running a Radeon 9600 Pro in an Intel Core 2 3 GHz system... it can probably run games just as fast with software rendering.

It's not completely obsolete - while most machines can play back HD pretty easily, a dedicated graphics chip can do it using much less power. That's a huge benefit on a laptop where hardware acceleration can boost battery life for watching movies.

And encoding is always a slow process; it can never be too fast. It would be great to be able to run it on a dedicated chip in the background so encodes go full speed without bogging down the machine for other uses.

With today's powerful hardware in even low end systems, a dedicated chip for encoding and decoding h.264 makes almost no sense.

Except for battery life on portables.

CPUs are really powerful, but they're also much bigger power hogs than dedicated chips.
 

motulist

macrumors 601
Dec 2, 2003
4,234
611
As I said in an earlier post, you're talking about high quality, high bitrate HD.

Uncompressed is 1-3 gigabits per second. We're talking terabytes for an hour of footage. No consumer cards will ever support that; only high-end pros use uncompressed HD.

And that's exactly why that's not what Apple would be doing. For the vast, vast majority of users, Blu-ray-level hi-def is already way higher quality than they're ever gonna care about for the next 10 years at least.
 

gnasher729

Suspended
Nov 25, 2005
17,980
5,565
A couple of years ago at MWSF I asked an Apple laptop engineer about this, since a recent article had mentioned that the GPU could do H.264 encoding. He said they weren't doing that, as they got better-quality output by doing it in software. That raises the question: are all H.264 encoders the same, or is there room in the implementation to trade off final quality for speed or simplicity of the algorithm? So if Apple decides to add a chip for H.264 encoding, the decision may hinge on any number of other factors besides simply being able to say it does H.264 encoding in hardware.

h.264 encoding is absolutely a compromise between processing power used and quality. There are two main reasons for this.

The first reason is that a lot of the improvements in h.264 over MPEG-2, for example, come from the fact that h.264 allows a variety of different algorithms. Some algorithms work better for some scenes (or parts of some scenes), some work better in others. To make use of this, an encoder needs to try, say, 16 different methods of encoding the video data, and then pick the one that gave the best/smallest result. Of course trying 16 different methods takes longer than trying only four, or only one.
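In rough Python terms, the mode-search loop looks something like this (a toy 1-D version with three made-up prediction modes; real H.264 intra prediction uses 2-D pixel blocks and up to nine directional modes, and weighs bits against distortion rather than raw error):

```python
# Toy mode selection: predict a block from its neighbours with each
# candidate mode and keep the one with the smallest residual.
# A smaller residual means fewer bits after transform + entropy coding.

def dc_predict(left, top):
    """Flat prediction from the average of the neighbouring pixels."""
    avg = (sum(left) + sum(top)) // (len(left) + len(top))
    return [avg] * len(top)

def horizontal_predict(left, top):
    """Copy the nearest left-neighbour pixel across the block."""
    return [left[-1]] * len(top)

def vertical_predict(left, top):
    """Copy the row of pixels directly above the block."""
    return list(top)

MODES = {"DC": dc_predict, "H": horizontal_predict, "V": vertical_predict}

def pick_best_mode(block, left, top):
    # Trying every mode costs len(MODES) times the work of trying one:
    # exactly the speed/quality trade-off described above.
    costs = {name: sum(abs(b - p) for b, p in zip(block, mode(left, top)))
             for name, mode in MODES.items()}
    best = min(costs, key=costs.get)
    return best, costs[best]

print(pick_best_mode(block=[100, 101, 99, 100],
                     left=[100, 100, 100, 100],
                     top=[101, 99, 100, 101]))
# -> ('DC', 2): the nearly flat block is predicted best by DC mode.
```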

The second reason is motion prediction. In basically all encoding methods, the most important thing is finding a block of pixels in a previous frame (or in multiple previous frames) that looks similar to the block you are encoding. The more similarity you find, the better the compression. But finding similar blocks basically means comparing each piece of one frame with as many pieces as possible from previous frames. If you compare against more pieces from other frames, you take more time, but you find more similarity and get better compression. So that is again a trade-off between time and quality.
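Here is what that search looks like in miniature (a full-search block matcher on made-up 1-D "frames"; real motion estimation searches 2-D windows, often with smarter-than-exhaustive patterns):

```python
def sad(a, b):
    """Sum of absolute differences: the usual block-matching metric."""
    return sum(abs(x - y) for x, y in zip(a, b))

def best_motion_vector(block, prev_frame, center, search_range):
    # A wider search_range examines more candidate positions: more
    # time spent, better odds of a low-error match. That is the
    # time/quality trade-off described above.
    best_offset, best_cost = 0, float("inf")
    for offset in range(-search_range, search_range + 1):
        start = center + offset
        if start < 0 or start + len(block) > len(prev_frame):
            continue
        cost = sad(block, prev_frame[start:start + len(block)])
        if cost < best_cost:
            best_offset, best_cost = offset, cost
    return best_offset, best_cost

prev = [10, 10, 50, 60, 50, 10, 10, 10]
cur_block = [50, 60, 50]            # the "object" moved 2 pixels left
print(best_motion_vector(cur_block, prev, center=4, search_range=3))
# -> (-2, 0): a perfect match found two positions to the left.
```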

Another trade-off happens with two-pass encoding. There are always some parts of a movie that can be compressed better than others. To get the overall best quality for a given movie size, you'd first find which parts still look good with more compression and which ones are harder to compress and need more bytes to look good. Then you compress some parts a bit more, and other parts a bit less. This gives the overall best quality at any given total size, but it means you first make a pass analysing the movie (which does basically all the work apart from writing the compressed movie to disk), and then another pass doing the real compression. That's twice the work for some quality improvement. (Without two-pass compression, some parts of the movie will look really good, and some parts will look like rubbish.)
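A toy version of that two-pass bit allocation (the complexity numbers are made up, standing in for a real first-pass analysis):

```python
# Pass 1 measures how hard each segment is to compress; pass 2 hands
# out a fixed bit budget in proportion, so difficult scenes get more
# bits and easy ones give some back. Total size stays the same.

def allocate_bits(complexities, total_bits):
    total_complexity = sum(complexities)
    return [total_bits * c / total_complexity for c in complexities]

# Pass 1 (analysis): action scene, static dialogue, end credits.
complexities = [9.0, 2.0, 1.0]

# Pass 2 (encode): spend a 12-megabit budget according to the analysis.
print(allocate_bits(complexities, total_bits=12_000_000))
# -> [9000000.0, 2000000.0, 1000000.0]
# Same total size as a flat 4 Mb per segment, but quality goes where
# it is needed instead of being wasted on the easy parts.
```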
 

dicklacara

macrumors 6502a
Jul 29, 2004
973
1
SF Bay Area
I can see this happening...

I read the recent Cringely post:

http://www.pbs.org/cringely/pulpit/2008/pulpit_20080801_005339.html

and it seemed pretty logical...

What if the "across the entire line" quote means the entire line of Apple products that display or process video, including the iPhone, Apple TV, some iPods, as well as Mac computers?

YouTube changed its codec to h.264 to support the iPhone (resulting in better quality, smaller files, and less bandwidth). Could other social sites find an advantage in doing the same?

For instance, consider the site

http://live.yahoo.com/

This site offers AV lifecasting in real time. It features a live 400x300 video of the primary broadcaster and live 160x120 videos of up to four concurrent secondary broadcasters.

Anyone can sign up, set up a free lifecast channel, and go live in minutes. The channel content includes gossip, goof-offs, entertainers, news, security monitoring...

Currently, this is all delivered with Flash (unavailable on the iPhone) and probably not practical on most mobile devices. What if Yahoo saw an advantage in offering h.264 streaming to make the site practical for mobile devices?

As a real-world example, there is a site I visit every week:

http://live.yahoo.com/sheenatv

It features Sheena Melwani, an up-and-coming singer/composer/musician, doing a live broadcast & AV chat from 5:30 to 6:30 PM PDT every Monday. The audience includes a group of 20-30 regulars and 40 or more drop-ins.

But soccer season has begun for the 3 grandkids, and the 2 boys have practice 4:30-7:00 (with driving). I can't use my iPhone to access the site (Flash), and there is no WiFi in the middle of the park. I am investigating using the iPhone to tether my MacBook, or using the iPhone via a VPN to an iMac at home... the latter works better but has no audio.

What I would really like is to view SheenaTV directly on the iPhone over EDGE so I could tune in, watch, listen and chat whenever the opportunity was available.

Maybe a future iPhone rev would even allow me to [broadcast] AV chat.

Is this a niche market? Maybe so, maybe not! The social experiments in sharing live video are in their infancy, but they seem to be well received. Who can say what effect h.264's improved quality, performance, and lower bandwidth will have?

It would be something if we could access the AV of choice, on the device(s) of choice at the time and place of our choosing.

Is it "King Content" or King Content-delivery?
 

slackpacker

macrumors 6502a
We're talking real-time, uncompressed HD, not "good enough" compressed HD content that isn't real-time. We're also hinting at real-time raytracing that needs a chipset OpenCL can leverage without taxing the GPU or any free cores.

Wow everybody....

Blu-ray is compressed, so that's where a GPU would come in handy.

Real-time uncompressed HD playback needs fast disk arrays and data throughput rather than a GPU, which would do nothing to move the massive amounts of raw data that HD video is.

3D raytracing is basically real-time now... it's called an NVIDIA GPU.
 

Manic Mouse

macrumors 6502a
Jul 12, 2006
943
0
Just add GPUs to the Mini and MB; bingo, h.264 decoding on all Macs. They can also be used with OpenCL. Everyone wins.
 

kabunaru

Guest
Jan 28, 2008
3,226
5
Just add GPUs to the Mini and MB; bingo, h.264 decoding on all Macs. They can also be used with OpenCL. Everyone wins.

I think that if the MacBook gets a dedicated graphics card, it will fly off the shelves literally. The more people buy it, the more it is going to make up for the lost MacBook Pro sales. ;)
 

Manic Mouse

macrumors 6502a
Jul 12, 2006
943
0
I think that if the MacBook gets a dedicated graphics card, it will fly off the shelves literally. The more people buy it, the more it is going to make up for the lost MacBook Pro sales. ;)

The BlackBook should always have had a dedicated GPU; it was meant to be the 12" PB replacement, was it not?
 

kabunaru

Guest
Jan 28, 2008
3,226
5
The BlackBook should always have had a dedicated GPU; it was meant to be the 12" PB replacement, was it not?

How about a standard 13" aluminium MacBook as a 12" iBook G4/13" MacBook replacement, and a 13" black aluminium MacBook as a true 12" PowerBook G4 replacement? Both would have dedicated graphics cards, but the black aluminium MacBook would get a slightly better card than the standard one.
Also, price the standard aluminium MacBook at $999 and the black aluminium MacBook at $1299, and give both LED screens.
Also, remove the combo drive from the base model.

Other than that, I don't know what else to say about the new MacBooks.
 

MacGeek7

macrumors 6502a
Aug 25, 2007
766
14
Blu-ray MacBook Pro and MacBook Tuesday. :D

I wish... or should I say iWish?

Either way, I've put off purchasing DVDs until Apple incorporates Blu-ray, because I don't want to watch DVDs in a Blu-ray player and I don't want to have to upgrade my DVD collection either.
 

tkiss

macrumors regular
Jun 10, 2008
120
0
I don't know very much about video cards, but doesn't a dedicated video card mean better graphics, and the fact that I will actually be able to play Spore on a MacBook when it comes out?
 

mdriftmeyer

macrumors 68040
Feb 2, 2004
3,792
1,914
Pacific Northwest
If I'm not mistaken, all current GPUs operate using rasterization, not raytracing, so I don't know what nVidia GPU you are talking about.

You're not mistaken, and they don't trace vectors on the fly, at differing graphics depths, with or without anti-aliasing in independent views, etc.

People are also forgetting that resolution independence needs this dedicated chipset [vector pipelines on steroids, with a DSP on a separate chip to handle other aspects of encoding/decoding] to do the heavy lifting of the matrix transforms without taxing the CPU and lagging the WindowServer.
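For the curious, the "matrix transforms" in question are, at their simplest, per-point multiplies like the sketch below (an arbitrary 1.5x scale factor; a real window server applies full affine transforms to every view, every frame):

```python
# Minimal illustration of the per-point matrix work behind
# resolution independence: scale UI geometry by a user-space factor
# before rasterizing. Trivial for one point; the load comes from
# doing it for every view in the window system at 60 fps.

SCALE = 1.5   # e.g. drawing a "100%" UI on a higher-DPI display

def scale_point(x, y, s=SCALE):
    # [x']   [s 0] [x]
    # [y'] = [0 s] [y]
    return (s * x, s * y)

corners = [(0, 0), (800, 0), (800, 600), (0, 600)]
print([scale_point(x, y) for x, y in corners])
# -> [(0.0, 0.0), (1200.0, 0.0), (1200.0, 900.0), (0.0, 900.0)]
```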

This isn't a software encoding solution; it's an OpenCL implementation that leverages dedicated hardware to do this without taxing CPU cores, or the GPU, until the GPU is needed for threads to aid in rasterizing lighting, textures, etc.

Dedicated units doing specific tasks in huge pipes rapidly, on the fly, at 60 fps or better will be a requirement for staying smooth in views that are already outputting HD while their parentView is being resized or moved, without chopping the video output.

Being able to add depth at varying color depths, and layer levels for basic UI manipulation behavior, also requires more off-loading.

This isn't just for people who want to use Final Cut Pro or Photoshop with more than 4 GB of RAM on images larger than 4K pixels with hundreds of layers.
 

mdriftmeyer

macrumors 68040
Feb 2, 2004
3,792
1,914
Pacific Northwest
Siggraph 2008

http://www.khronos.org/news/events/detail/siggraph_2008_los_angeles_california/


Beyond Programmable Shading: Fundamentals

SIGGRAPH Core | Thursday, 14 August | 8:30 am - 12:15 pm | 403 AB
Shader Guest Speakers:

11:30 OpenCL Aaftab Munshi, OpenCL WG

Who is he?

Aaftab Munshi is the spec editor for the OpenGL ES 1.1 and 2.0 specifications. Now at Apple, he was formerly senior architect in ATI’s handheld group.

OpenCL Working Group and how it impacts OpenGL: Neil Trevett, OpenCL WG

Neil Trevett: Nvidia

The link gives an overview of the topics from AMD, Nvidia and Intel.
 

kjs862

macrumors 65816
Jan 21, 2004
1,297
24

Sounds interesting. There seems to be a lot of news these days about optimizing code for multi-core and increasing the GPU's role in processing.
 

InkMaster

macrumors 6502a
Nov 30, 2007
522
1
Nagoya, Japan
Sigh... I too want a 720p iSight, but there's a problem...

...the problem is called ISP asshats. With what Comcast, for example, gives you (45 KB/s up), you can't even stream 640x480 in full quality. Ever see what you get on the other end? Blurry, blurry crap.

The problem isn't the quality of the webcams so much as the service that has to transfer that video...

I mean, it doesn't matter if Apple throws in a 1080p 3CCD camera in place of the iSight; the quality on the other end won't change... :(
 

bigwig

macrumors 6502a
Sep 15, 2005
679
0
I want to be able to rip Blu-Ray to hard disk and distribute it to any player in my house. My poor Mini isn't quite up to the task, but with dedicated hardware it would be a sweet setup (assuming Apple or the Blu-Ray dweebs themselves don't actively get in the way of doing this).
 

ryanw

macrumors 6502
Oct 21, 2003
307
0
seems redundant

I would think the only move Apple would make is to incorporate a minimum-level graphics card that has hardware h.264 encode/decode support. My Mac Pro has a graphics card that does H.264 encode/decode. I believe MacBook Pros do too...
 

Joe The Dragon

macrumors 65816
Jul 26, 2006
1,022
474
Yes, the CPU is still involved in some parts of decoding, and it does depend on the GPU implementation, but generally when they say they support GPU h.264 decoding, there is still a massive difference.

http://www.hkepc.com/?id=1510&page=5&fs=idn#view

Near the bottom is a comparison of h.264 and Blu-ray decode between the GMA X3500 in the G35 chipset which does not support h.264 decoding and the GMA X4500HD in the G45 chipset which does. CPU utilization dropped from 86% to 10% for h.264 file decode and 95% to 23% for Blu-ray decode. And this is on a 1.6GHz Celeron. Even the GMA X4500 IGP has no problems with h.264 decode. It's just not a worthwhile feature to buy a separate chip for now that even Intel IGPs support it just fine.

Incidentally, on these early drivers, the GMA X4500HD is actually within 10-30% of the performance of AMD's HD 3200 IGP which is on mature drivers. Which is fairly decent for an Intel IGP.
And now ATI has the new 3300 IGP, and boards with SidePort RAM give it an even bigger boost.
 