
Sesshi

FCS (Final Cut Studio) + general-purpose computing for a new starter and, apparently, a longtime Stevezombie.

My guru, who's away, thinks it's a tie given what he (and I) have read, but since we've never ordered a 2600 XT none of us knows for sure. In terms of raw GPU potency it should be a no-brainer, but I've been reading about the Core Image (CI) problems with the 8800GT, which doesn't surprise me - they can't write Vista drivers to save their lives, so why would the Apple situation be any different?

The question is: would one 8800GT driving two 23" monitors do a better job with Final Cut and general-purpose computing than two 2600 XTs?

Thanks for any non-conjecture replies.
 
Two 2600s won't be any faster than one 2600, but the 2600 will likely be on par with or outperform the 8800 in CI apps. I've not seen any benchmarks that back this up, though, and a driver update down the line may resolve it. For now, the general consensus is that the 8800 is a gaming card... and that's it.
 
Eh...video cards don't really make a difference in video editing. The power of your card only comes into play when getting into stuff like Motion or After Effects.
 
AFAIK you won't see any benefit from taking a "one GFX card per monitor" approach. More RAM would probably be a better investment than a second GFX card.


Lethal
 
Eh...video cards don't really make a difference in video editing. The power of your card only comes into play when getting into stuff like Motion or After Effects.

I was thinking of it for rendering. I don't expect anything uber-jazzy at this point in time, but just thinking ahead.

I've decided to order the single 2600, and we'll see how it goes from there. Thanks for the comments.
 
I was thinking of it for rendering. I don't expect anything uber-jazzy at this point in time, but just thinking ahead.

I've decided to order the single 2600, and we'll see how it goes from there. Thanks for the comments.
Rendering is all up to the CPU unless you are using Motion or Color.
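
For what it's worth, you can measure that split rather than guess. Below is a rough Swift sketch (mine, not from this thread; the blur chain is just a stand-in workload) that times the same Core Image filter once on the GPU and once on the CPU software renderer:

import CoreImage
import CoreGraphics
import Foundation

// Time one Core Image render, on the GPU or on the CPU software
// renderer. The gap between the two is roughly what the graphics
// card buys you in CI-accelerated apps like Motion or Color.
func timeRender(softwareRenderer: Bool) -> TimeInterval {
    let context = CIContext(options: [.useSoftwareRenderer: softwareRenderer])
    // Stand-in workload: a 1920x1080 solid-colour frame through a Gaussian blur.
    let frame = CGRect(x: 0, y: 0, width: 1920, height: 1080)
    let input = CIImage(color: .red).cropped(to: frame)
    let blurred = input.applyingGaussianBlur(sigma: 10).cropped(to: frame)
    let start = Date()
    _ = context.createCGImage(blurred, from: frame) // forces the actual render
    return Date().timeIntervalSince(start)
}

print("GPU render time:", timeRender(softwareRenderer: false))
print("CPU render time:", timeRender(softwareRenderer: true))

If the GPU number on a given card isn't clearly ahead of the CPU number, the drivers are leaving performance on the table.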


Lethal
 
I was just speculating that, given CI's reliance on OpenGL (if I understand correctly), NVidia might be artificially crippling the 8800GT even on the Mac, so as not to show up the Windows cards and, more importantly, the FX5600.
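
For background on that point: a Core Image kernel is written in a GLSL-based kernel language and compiled by the GPU's OpenGL driver at runtime, so driver quality hits CI head-on. A minimal Swift sketch with a made-up brighten kernel, just to show the shape:

import CoreImage

// A custom CI kernel in Core Image's GLSL-based kernel language.
// The card's driver compiles this like any other shader, so a weak
// OpenGL driver means weak (or broken) CI performance.
let source = """
kernel vec4 brighten(sampler src, float amount) {
    vec4 pixel = sample(src, samplerCoord(src));
    pixel.rgb += amount;
    return pixel;
}
"""
let kernel = CIKernel(source: source) // nil if the driver/compiler rejects it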

I try not to get too in-depth with this sort of thing, but there are posts out there claiming to have successfully softmodded GeForce cards into their Quadro equivalents, ending up with Quadro-like enhanced OpenGL performance but correspondingly crippled DirectX performance. So is the CI inferiority of the 8800 an indirect attempt by NVidia to preserve the 8800's overall position in their lineup across all platforms?

Or am I just spouting conspiracy theories?
 