I just ordered this card with the UK HE discount; I'm well chuffed with the price drop! I was looking around on overclockers.co.uk to see what the PC versions of the 8800 GT were selling for, and the cheapest equivalent card was £113. So although I'm paying a bit more for the Apple version, at least it's not double the price, as it potentially could have been at the old price of £220.
 
...As this puppy is running on PCI Express v1.0 compared with the new Mac Pros, I'm just wondering what it'll do to the framerates with the slower bus.

I believe it's PCIe 1.1. Also, whether it's 1.1 or 2.0 doesn't make a performance difference, right now. In the future we will see bandwidth approaching the limits of PCIe, but for now we are good-to-go. I think what makes the difference in performance between the new and old MPs is the faster processors, FSB, and RAM. Anyone care to confirm OR correct me here?

Itching for a forthcoming review from barefeats.com.... Get on it guys...

Yes, can't wait for reviews from multiple sources!
 
You could have bought a new Mac Pro any time this year, the 08 models do support the 8800 GT. I know I have one and it smokes.


And when you say "it smokes", I'm assuming you mean "it smokes" as compared to the standard 2600 card. What programs/tasks/apps are you running to get it to "smoke" the 2600 card?
 
I believe it's PCIe 1.1. Also, whether it's 1.1 or 2.0 doesn't make a performance difference, right now. In the future we will see bandwidth approaching the limits of PCIe, but for now we are good-to-go. I think what makes the difference in performance between the new and old MPs is the faster processors, FSB, and RAM. Anyone care to confirm OR correct me here?

I think you'll find quite a few Crysis players that'll disagree with that statement.

For games based on the Steam or Unreal engines, yes, I'd say we're probably going to be all right with 1.1.

Personally, I can't wait for UT3 to stop that irritating tearing effect it has on my 1900 XT.

M.
 
Response to MikeDTyke:
Are you telling me that you are running an application that needs more GPU bandwidth than 16 lanes at 250 MB/s per lane? That's 4 GB/s (that's giga-BYTES) of pure GPU throughput, and that's just PCIe 1.1. Last time I checked, SATA-II on the MP doesn't even have that much bandwidth for all the drives combined! Trust me, the PCIe 1.1 bus is not the bottleneck.... CPU, FSB, RAM.
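As a quick sanity check on the arithmetic above (assuming the published PCIe 1.1 figures: 2.5 GT/s per lane with 8b/10b encoding, i.e. 250 MB/s of usable bandwidth per lane, per direction):

```python
# PCIe 1.1: 2.5 GT/s per lane; 8b/10b encoding leaves 250 MB/s
# of usable payload bandwidth per lane, per direction.
LANES = 16
MB_PER_LANE = 250

pcie11_gbs = LANES * MB_PER_LANE / 1000
print(f"PCIe 1.1 x16: {pcie11_gbs:.1f} GB/s per direction")  # 4.0 GB/s

# PCIe 2.0 doubles the signalling rate to 5.0 GT/s, so the
# per-lane bandwidth doubles to 500 MB/s.
pcie20_gbs = LANES * 2 * MB_PER_LANE / 1000
print(f"PCIe 2.0 x16: {pcie20_gbs:.1f} GB/s per direction")  # 8.0 GB/s
```

So moving from 1.1 to 2.0 only doubles the host-to-card link; it does nothing for the card's local memory speed.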
 
The 9800 GTX is marginally better than the 8800 GT. As I said, they both use the same core. Nvidia is 'lying' by calling it the 9-series. It's not a new generation.

The SLI card (9800 GX2) is just a dual-PCB card in a single slot; they did that with the 7950 GX2, although that one needed SLI.

So even though the 8800 GT is old, it's (unfortunately) still up to date. It's only missing tri-SLI and DX10.1.
None of the Nvidia cards support DX10.1

Wow!

I am quite surprised at the discomfort many feel over this new GPU option for us gen 1 Mac Pro owners.

I am quite content to get this, and I am glad that the emails to the media and Apple have paid off for those of us who cared about a viable solution for running OSS and Pro Apps. I know gamers are happy as well, but to see some complain now that the card is out?

C'mon, NVIDIA's marketing hype is about getting Windowz gamers to shell out more money for cards that are just overclocked and/or two cards wrapped in a technical package with marginal gains.

For me, I bought it yesterday as soon as I found it on the Apple Store. I can now have a stable system with all my monitors running, and I can regain my productivity when I run my batches. The more content I create, the more viable my business is. And for that, in a year or two (my system is only months old) I will invest in Apple again with a greater degree of confidence than I could have had just two days ago!

To all who bashed our cause to get this card... and thought that we would never gain anything... well, I never did understand the reason for your behavior, nor that of those who now ask for the 9xxx instead. I see that many don't even own an MP :confused:

I hereby send a thank you to all who stood the punishment, and at times humiliation, of sitting on this... we are all well compensated. :D:D:D:D:D
I think it is more the feeling that Apple is an advanced computer maker. They should be using advanced parts. I mean, they got Intel to give them the Xeons early; why can't they get Nvidia (or AMD) to get the latest GPUs to them?

Response to MikeDTyke:
Are you telling me that you are running an application that needs more GPU bandwidth than 16 lanes at 250 MB/s per lane? That's 4 GB/s (that's giga-BYTES) of pure GPU throughput, and that's just PCIe 1.1. Last time I checked, SATA-II doesn't even have that much bandwidth. Trust me, the PCIe 1.1 bus is not the bottleneck.... CPU, FSB, RAM.
Hmm, you go from 40+GB/s (GPU mem to GPU) of bandwidth to 4GB/s (system mem to GPU) and that isn't a problem?
 
None of the Nvidia cards support DX10.1

I think it is more the feeling that Apple is an advanced computer maker. They should be using advanced parts. I mean, they got Intel to give them the Xeons early; why can't they get Nvidia (or AMD) to get the latest GPUs to them?

Yes, I agree with you. However, I think that Apple has not bought into the NVIDIA marketing strategy of providing the same GPU under several specs. As such, they select (albeit at a slow pace) those cards that best fit a performance level adequate for their market. We could say price/performance, but someone would be quick to point out that there are cheaper alternatives with higher performance; they forget that the value here is the OS/App/HW integration you get with Apple.....

Hey... we have the damn card... now LEAVE APPLE ALONE!!!! :D:D:D:D:D
 
How long until the firmware is ripped and posted on the net so we can buy equivalent cards from other OEMs and flash them? ;)

The cheapest 8800 GT 512MB card I found on Newegg is $194.99. So although you pay $85 more for the Apple card, you'd still have to source one of the special power cables (which the last time I checked, about a year ago, were around $30) and go through the hassle of flashing and living without a warranty. Not really worth it.
 
None of the Nvidia cards support DX10.1

It does support DX10.0 though, right? That's according to the nVidia site. I haven't kept up on the DX technology, but what's the big difference between 10.0 and 10.1?

And since it is DX10 compatible I assume booting into Vista via Boot Camp allows for playing games in DX10, yes?
 
I don't know why Apple didn't just stick with BIOS in their computers, thus avoiding these problems in the first place.

Yes, it is an ancient piece of crap compared to EFI, but who gives a monkey's. The first thing the OS does is clear out all the BIOS crap anyway, since it doesn't work in protected mode. So it is loaded for a few seconds before it is nuked. Is all this expended effort really worth it for a few more boot options?
 
I don't know why Apple didn't just stick with BIOS in their computers, thus avoiding these problems in the first place.

Yes, it is an ancient piece of crap compared to EFI, but who gives a monkey's. The first thing the OS does is clear out all the BIOS crap anyway, since it doesn't work in protected mode. So it is loaded for a few seconds before it is nuked. Is all this expended effort really worth it for a few more boot options?

OHHHH but EFI looks sooooo much groovier during POST! :p
 
Hmm, nope. If you were actually using that bandwidth it would be a problem. But you're not, and won't for quite some time. Next....

You're not trying to tell us that all the textures are preloaded into the video ram on the card now are you? :rolleyes:

Anyway we're not talking about raw bandwidth of the bus, but the efficiency with which we can stream from system ram to the graphics card memory.

It's true that the system memory is the slowest link in the chain, but that doesn't directly translate to PCI Express 1.1 not affecting overall performance of the card. I'll wait to see what real world tests throw up rather than theoretical limits.

M.
 
Yes, I agree with you. However, I think that Apple has not bought into the NVIDIA marketing strategy of providing the same GPU under several specs. As such, they select (albeit at a slow pace) those cards that best fit a performance level adequate for their market. We could say price/performance, but someone would be quick to point out that there are cheaper alternatives with higher performance; they forget that the value here is the OS/App/HW integration you get with Apple.....

Hey... we have the damn card... now LEAVE APPLE ALONE!!!! :D:D:D:D:D

Even then, Apple doesn't really take advantage of the true hardware capability that the GPUs present, so the OS/HW integration isn't complete. But yes, most should be happy that Apple did come through, even if they claim that Nvidia didn't (which raises the interesting question of why Nvidia didn't come through).

You're not trying to tell us that all the textures are preloaded into the video ram on the card now are you? :rolleyes:

Anyway we're not talking about raw bandwidth of the bus, but the efficiency with which we can stream from system ram to the graphics card memory.

It's true that the system memory is the slowest link in the chain, but that doesn't directly translate to PCI Express 1.1 not affecting overall performance of the card. I'll wait to see what real world tests throw up rather than theoretical limits.

M.
Oh, you'd probably find that the new card really isn't dependent on system memory for much, provided the frame buffer is large enough. So yeah, PCIe 1.1 versus 2.0 is only relevant if the 2.0 system uses the higher power allowance (which Apple's doesn't).
 
You're not trying to tell us that all the textures are preloaded into the video ram on the card now are you?
Never said that. But how do those textures get to the GPU? Oh YEAH! The hard drive and optical drive store the files that ultimately supply those textures! Hard drives run wicked fast! What was I thinking! :rolleyes:

Anyway we're not talking about raw bandwidth of the bus, but the efficiency with which we can stream from system ram to the graphics card memory.
It's true that the system memory is the slowest link in the chain, but that doesn't directly translate to PCI Express 1.1 not affecting overall performance of the card. I'll wait to see what real world tests throw up rather than theoretical limits.

While I agree with this part, I'm not waiting. It can't be as bad as my 7300.
 
Hmm, nope. If you were actually using that bandwidth it would be a problem. But you're not, and won't for quite some time. Next...

EDIT: What's up with your fuzzy math? It's only double the bandwidth.

My math is fuzzy because I know the memory bandwidth of the 8800 GT is higher than 40 GB/s, but I wasn't sure exactly what it is (I want to say 60 GB/s).

Well, if you aren't using the bandwidth, why upgrade at all?
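For reference, the 8800 GT's published memory spec is 900 MHz GDDR3 on a 256-bit bus, which works out to 57.6 GB/s, so the ~60 GB/s guess above is close:

```python
# 8800 GT reference memory spec: 900 MHz GDDR3 (double data rate)
# on a 256-bit bus.
mem_clock_hz = 900e6
transfers_per_sec = 2 * mem_clock_hz   # DDR: two transfers per clock
bus_bytes = 256 // 8                   # 256-bit bus = 32 bytes per transfer

vram_gbs = transfers_per_sec * bus_bytes / 1e9
print(f"8800 GT local VRAM: {vram_gbs:.1f} GB/s")  # 57.6 GB/s

pcie11_gbs = 16 * 250 / 1000           # x16 slot at PCIe 1.1 rates
print(f"VRAM is {vram_gbs / pcie11_gbs:.1f}x the PCIe 1.1 link")  # 14.4x
```

Which is the crux of the disagreement in this thread: local VRAM is an order of magnitude faster than the bus, so the bus only matters when data has to cross it.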
 
The cheapest 8800 GT 512MB card I found on Newegg is $194.99. So although you pay $85 more for the Apple card, you'd still have to source one of the special power cables (which the last time I checked, about a year ago, were around $30) and go through the hassle of flashing and living without a warranty. Not really worth it.

That's a good point. However, we all know how Apple is about lowering prices on tech as it grows more obsolete (they don't). So in two years when 8800GTs are dirt cheap, Apple will still be selling them at $279, and doing the flash thing will be the *only* sensible option.
 
Well, if you aren't using the bandwidth, why upgrade at all?

Probably because it has twice the VRAM of the 7300 and is a much more reliable product than the ATI. But if you or anyone else (MikeDTyke) are so concerned about this card running on PCIe 1.1 instead of PCIe 2.0, then just go out and buy a new Mac Pro and quit whining in this thread.
Seriously, Apple took a while, but they came through with what should be a great product. Some people are never happy. Sheesh. :rolleyes:
 
Never said that. But how do those textures get to the GPU? Oh YEAH! The hard drive and optical drive store the files that ultimately supply those textures! Hard drives run wicked fast! What was I thinking! :rolleyes:

While I agree with this part, I'm not waiting. It can't be as bad as my 7300.

Depends on the game; most of mine don't touch the drive except when loading a level, or assets are preloaded before they are required. Typically all the textures sit in system memory, and only the most common and currently needed ones are loaded into GPU RAM.
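A hypothetical sketch of the residency scheme just described (the names, sizes, and `TextureCache` class are all made up for illustration): textures live in system RAM, the most recently used ones occupy a fixed VRAM budget, and only a cache miss costs a transfer over the PCIe bus.

```python
from collections import OrderedDict

class TextureCache:
    """Toy model: LRU set of textures resident in a fixed VRAM budget."""

    def __init__(self, vram_budget_mb):
        self.budget = vram_budget_mb
        self.used = 0
        self.resident = OrderedDict()  # texture name -> size (MB), LRU order
        self.bus_transfers_mb = 0      # traffic pushed across the PCIe bus

    def bind(self, name, size_mb):
        if name in self.resident:                 # hit: already in VRAM
            self.resident.move_to_end(name)
            return
        while self.used + size_mb > self.budget:  # evict least recently used
            _, evicted_mb = self.resident.popitem(last=False)
            self.used -= evicted_mb
        self.resident[name] = size_mb             # miss: stream over the bus
        self.used += size_mb
        self.bus_transfers_mb += size_mb

cache = TextureCache(vram_budget_mb=512)
for tex in ["wall", "floor", "sky", "wall", "enemy"]:
    cache.bind(tex, 64)
print(cache.bus_transfers_mb)  # 256: only the four misses cross the bus
```

The point being: the larger the VRAM (512MB here versus the 7300's), the fewer misses, and the less the PCIe generation matters.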

As for not wanting to wait, it's a no-brainer for you to upgrade; a 7300 must be a really painful gaming experience.

I just want to know whether I'm going to get the ~40% speed-up over my 1900 XT that the version in the 2008 Mac Pro delivers, before I cough up the readies.

Oh, and I'm not whining; that would imply I think this card isn't good enough or that Apple hasn't delivered. I'm saying neither. I just want to be sure it's worth the readies over what I already have. Doesn't seem too unreasonable?

M. :D
 
As for not wanting to wait, it's a no-brainer for you to upgrade; a 7300 must be a really painful gaming experience.
I just want to know whether I'm going to get the ~40% speed-up over my 1900 XT that the version in the 2008 Mac Pro delivers, before I cough up the readies.

M. :D

I hear you brother. If I had the 1900XT I would probably wait too. Keep an eye on barefeats.
 
Probably because it has twice the VRAM of the 7300 and is a much more reliable product than the ATI. But if you or anyone else (MikeDTyke) are so concerned about this card running on PCIe 1.1 instead of PCIe 2.0, then just go out and buy a new Mac Pro and quit whining in this thread.
Seriously, Apple took a while, but they came through with what should be a great product. Some people are never happy. Sheesh. :rolleyes:

But according to you, system bandwidth isn't an issue, so you can just pull the other 256MB from system memory and be no worse off?

The 512MB card isn't likely to pull from system RAM in anything short of rendering work, so I wouldn't worry about it.

My concern is why Apple couldn't get help from Nvidia. What could possibly be more important than helping Apple design, spec out, and build this great product?
 
But according to you, system bandwidth isn't an issue, so you can just pull the other 256MB from system memory and be no worse off?
Don't put words in my mouth. I NEVER said that. I said the bus was not the bottleneck. Do you even know how a computer works? I said CPU, FSB, and RAM were the bottlenecks. In case you don't know, system bandwidth and FSB are synonymous. I'm done with this argument. Next...
 