Common resolution for the big LCDs
True, but I always get the impression from folks here that gaming (because, let's be honest, nothing else drives cards like gaming) isn't done at resolutions higher than 1600x1200.
nVidia's D8E should arrive in January.
The D8E is nVidia's answer to ATI's Radeon HD 3870 X2 dual-chip card. It is based on two G92 (8800 GT) chips.
The D8E will be a high-end video card.

I have not yet heard of a single-card replacement for the 8800 GTX.
Most likely nVidia will move the GTX to a G9x part, since the overclocked G92 cards that retailers sell already outperform it at resolutions below 1920x1200.
You've got to be talking PC because it won't be in the Mac Pro.
You never know. If Apple waits to update the MP, it's possible - or at least it could be a BTO option. Of course, this assumes Apple lets Nvidia write the drivers rather than trying to do it themselves.
 
I know that within my group of gamer friends, 1280 x 1024 (17" LCDs) was the standard until about last year.

We just made the bump up to 1680 x 1050 this year.
 
2560x1600 is common for a 26-30 inch LCD, but 26-30 inch LCDs are NOT common among any user group - especially gamers. Gamers cannot play a game smoothly at that resolution without spending $1,200 on an SLI rig (and that's JUST the video card pricing).

1920x1200 (a 24 inch LCD) is the max ANY gamer should go for quite a long time.
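
To put rough numbers on why 2560x1600 is so brutal, here's a quick back-of-the-envelope pixel count (a Python sketch using the resolutions from this thread; shading and fill-rate cost scale roughly with pixels per frame, so this is only a ballpark):

# Pixels per frame at the resolutions discussed above.
# GPU load scales roughly with this count.
resolutions = [
    ("1280x1024", 1280, 1024),
    ("1680x1050", 1680, 1050),
    ("1920x1200", 1920, 1200),
    ("2560x1600", 2560, 1600),
]

base = 1280 * 1024
for name, w, h in resolutions:
    px = w * h
    print(f"{name}: {px / 1e6:.2f} MP, {px / base:.2f}x of 1280x1024")

2560x1600 works out to about 4.1 million pixels per frame - roughly 1.8x the load of 1920x1200 and over 3x that of 1280x1024.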
 
1920x1200 is fine as long as you are using tons of AA and AF. Otherwise, I stick to my comment that the G92 and G80 are overkill for most Mac gamers.

But do you guys turn up the image quality (IQ) settings? Or do you play the games looking like PS2-level stuff?
 
Let's see what we have:

6800 GT - dual 19" CRT
7600 GT - 17" LCD
7950 GT - 22" LCD
3850 - 22" LCD

We can push, say, Joint Operations and Battlefield 2 to High, but it takes a bit more for Company of Heroes and World in Conflict.

I'm sporting the 3850, and World in Conflict runs great on High settings until every explosive weapon known to man shows up and the debris has to be rendered.

At best we fall into the performance midrange. $130-280 is the range we shoot for on a video card.
 
Because the Mac Pro hasn't had a video card upgrade in 2 years. They won't do it now because it's not a gaming machine.

OK... Apple signs deals with EA so they can license games for OS X. So which Mac in the Apple line IS the gaming machine? It certainly isn't the Mac mini or MacBook! The iMac GPU sucks as well, so that leaves the MBP and MP.

The MP being the top level machine, one would expect it to be the gaming machine as well. Or at least one of the gaming options.

All I can say is it's going to happen in the next 8 weeks. Which GPU... who knows. It's more than likely to be an 8800 or some kind of ATI 3 series. If it's either of those, then I'll be happy.
 
Nice. I don't do too much PC gaming anymore - the upgrading costs were starting to kill me. Any reason why you didn't get an 8800GT instead of the 3850?
 
8800 GT - gouged to $289-319 and out of stock

3850 - $179 and bought on launch day

I haven't overclocked it yet but it's a killer card for $179. Still, I probably should have gone with the 3870 now that I think about it some more.

I only spent $640 on my gaming machine, mind you.
 
Won't there be some sort of 8800GTX-TypeR-Turbo card along soon (and the question of "will it fit in the Pro to replace the pathetic card that Apple shipped?") to repeat this cycle of OMG-age? The 8800GTX is now pretty old.
 
Figures - I just looked, and the PNY 8800GT is only $269 @ Newegg. But the raised prices are par for the course.

Well, there is always the 8800 Ultra. Just in case you wanted to spend $700 on a card.
 
I'd have to eat, sleep, and breathe NewEgg to get it at $269.

Strangely enough, CompUSA was offering the 8800GT for only $249 last week. I had already purchased my 3850, though.
 
After a rebate or two it'll be under $600.

 
They're expecting the GeForce 9 series sometime in February-March, so the next-generation battle is about to begin again. At least ATI redeemed itself with the 3850 and 3870s, which have sold really well, and their upcoming R700 is supposed to be quite impressive.

As for gaming resolutions, 1680 x 1050 is kind of the new standard, replacing 1280 x 1024 (though the majority still game at 1280 x 1024). I have a 30" Dell monitor, and at 2560 x 1600 I can play everything maxed out (except for AA) other than Call of Juarez and Crysis. Everything else runs easily with a single (overclocked) 8800 Ultra, and at that resolution AA isn't a necessity. I can't wait for the next generation, though, considering it might be two to three times faster than the 8800 Ultra is right now - which itself is 3-4 times faster than the X19xx series that ruled last generation...
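
For what it's worth, compounding those claimed multipliers gives a sense of the implied generational jump (hypothetical arithmetic only - the 2-3x and 3-4x figures are estimates, not benchmarks):

# Compounding the claimed speedups: next-gen vs 8800 Ultra (2-3x)
# and 8800 Ultra vs X19xx (3-4x).
next_gen_vs_ultra = (2, 3)
ultra_vs_x19xx = (3, 4)
low = next_gen_vs_ultra[0] * ultra_vs_x19xx[0]
high = next_gen_vs_ultra[1] * ultra_vs_x19xx[1]
print(f"Implied next-gen speedup over X19xx: {low}x-{high}x")  # 6x-12x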
 
January may be the magic month


January is end-of-line for the 8800 series cards. That probably means a new high-end card from nVidia in January or February.
Sometime early in 2008, ATI is bringing out a dual-RV670 card (the HD3870 X2) that will have a total memory bandwidth of 115.2 GB/s.
Compare that to 103.7 GB/s for the 8800 Ultra and 86.4 GB/s for the 8800 GTX.
Sometime early in 2008, nVidia will counter with a dual-G92 (8800 GT x2) video card.
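
Those figures fall straight out of bus width times effective memory transfer rate. A quick sketch, assuming the commonly cited specs (384-bit @ 1.8 GT/s for the GTX, 384-bit @ 2.16 GT/s for the Ultra, and two 256-bit @ 1.8 GT/s interfaces on the 3870 X2):

# memory bandwidth (GB/s) = bus width in bytes * effective transfer rate (GT/s)
def bandwidth_gbs(bus_bits, rate_gts, chips=1):
    return bus_bits / 8 * rate_gts * chips

print(f"8800 GTX:   {bandwidth_gbs(384, 1.80):.1f} GB/s")           # 86.4
print(f"8800 Ultra: {bandwidth_gbs(384, 2.16):.1f} GB/s")           # 103.7
print(f"HD3870 X2:  {bandwidth_gbs(256, 1.80, chips=2):.1f} GB/s")  # 115.2
# Note: the X2's total is split across two GPUs, so each RV670
# only has 57.6 GB/s of local bandwidth.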

I have a 30" Dell monitor and at 2560 x 1600, I can play everything maxed out (except for AA) in every game at that resolution except for: Call of Juarez and Crysis. Everything else runs fine easily with a single 8800 Ultra (overclocked though).

Some people on this forum think the X1900XT is a high-end card.
The next generation Mac Pro will not have a video card that can touch your 8800 Ultra.
One year later, the 8800 Ultra and 8800 GTX are still the kings of the video cards.
 
That bandwidth has to be shared between the two chips on the RV670 X2, so comparing it against the 8800 Ultra/GTX is misleading. SLI'ed 8800 Ultras/GTXes would walk all over the RV670 X2, although I'll give AMD kudos for CrossFire scaling - it is remarkably better than SLI.
 
I agree, but how many people can afford 8800 Ultras/GTXes in SLI?
That G92 x2 will walk all over the RV670 x2.

Well, in the context of a Mac Pro, I would think that if you could afford a Mac Pro you could afford 8800 Ultra SLI (assuming Apple offered it as a solution). What would be neat to see is whether AMD can pull off a GX2-type board with the RV670 and then CF that. Boy, that would be a monster setup. Quad CF or quad SLI is just sick.
 
It's called R680 and is basically two RV670 chips on one PCB with a CrossFireX connector. So yes, come January it should even be possible to connect four of them together - or four regular RV670 boards.
 