This is from the Apple support site:

"The Mac Pro (Early 2008) computers implement PCI-E revision 2.0 which support twice the data rate per lane as the PCI Express revision 1. Slots 1 and 2 are both x16 revision 2.0 slots. Slots 3 and 4 are both x4 revision 1 slots.

The number of lanes for each of these slots is permanently set for the Mac Pro (Early 2008). The Expansion Slot Utility doesn't apply."

So the Expansion Slot Utility doesn't work on my Mac. Does that also mean the GTX has to physically be in the first slot to get the full benefit? I interpreted this to mean slots 1 and 2 could both take x16 cards. I just assumed I was getting full power in Windows based on the results I got in Crysis, which seem to be dead-on with the benchmarks I've read for this card...

Does the following apply to the Early 2008 Mac Pro as well? "Only one slot actually has 16 lanes going to it—you can put a x16 card into a slot that is only using one lane and it will operate properly, but it will operate more slowly than the same card in a x4, x8 or x16 slot." And what are the implications if the 8800 is disabled in Windows?
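
For reference, the raw numbers behind those lane counts: PCIe 1.x carries roughly 250 MB/s per lane in each direction (after 8b/10b encoding), and PCIe 2.0 doubles that to roughly 500 MB/s. A quick back-of-the-envelope sketch in Python (nominal spec figures; real-world throughput runs a bit lower):

```python
# Nominal one-direction PCIe bandwidth per lane, after 8b/10b encoding:
# PCIe 1.x runs at 2.5 GT/s (~250 MB/s usable), PCIe 2.0 at 5 GT/s (~500 MB/s).
MB_PER_LANE = {"1.x": 250, "2.0": 500}

def slot_bandwidth(revision: str, lanes: int) -> int:
    """Approximate one-direction bandwidth of a slot in MB/s."""
    return MB_PER_LANE[revision] * lanes

# Early 2008 Mac Pro layout, per the Apple note quoted above:
print(slot_bandwidth("2.0", 16))  # slots 1 and 2: ~8000 MB/s each
print(slot_bandwidth("1.x", 4))   # slots 3 and 4: ~1000 MB/s each
print(slot_bandwidth("1.x", 1))   # an x16 card running on one lane: ~250 MB/s
```

So yes, an x16 card works in a narrower or slower slot; it just gets a fraction of the bandwidth, which is exactly the behavior that quoted passage describes.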
 
I'll be sure to let you know when I get it all working.

Can anyone shed any light on the PCIe x16 slot question above?
 
Hi again,
I think you have no slot problem, because you have the Early 2008 version of the Mac Pro. I have the 2007 Mac Pro, and there you do have a slot "problem".

Do you think the GTX 295 can also be mounted in the first slot?
If it is possible and bootable, I would go and buy myself this graphics monster.



 
Cool, thanks guys. I thought so but sometimes it's good to get corroboration from someone smarter than me. :)
 
... I thought it was interesting that the GTX shows up twice in Device Manager. Here's another screenshot showing the GTX 295 specs in the Appearance control panel.

Yeah, the way the card "announces" itself to the computer is as two separate chips. ATI figured out how to make their X2 variants "announce" themselves as one GPU.

That is probably why OS X has no idea what to do with it: it doesn't know what single-card SLI is, and ATI never exposes that.
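
If you'd rather not squint at Device Manager, WMI can list every display adapter Windows sees, and a GTX 295 should show up twice, once per GPU. A minimal sketch in Python, shelling out to the built-in wmic tool (assuming Python is installed on the Vista/Boot Camp side):

```python
# List every display adapter Windows reports, via the built-in WMIC tool.
# A GTX 295 should appear twice, once for each GPU on the card.
import subprocess

output = subprocess.check_output(
    ["wmic", "path", "Win32_VideoController", "get", "Name"],
    text=True,
)
for line in output.splitlines():
    name = line.strip()
    if name and name != "Name":
        print(name)
```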
 
SCENARIO A (aka "I'm a Mac"): OSX boots just fine every time with both cards powered and hooked up to the same monitor.

SCENARIO B (aka "I'm a PC"): Booting Vista gets... random results:

1. Same setup as above: occasionally it works and defaults to the 8800 in slot 1. Usually the GTX is auto-disabled by Windows, claiming it's conflicting with a VGA driver (?). Vista won't let me manually install the NEC driver on the "secondary" display, which also shows up as the same VGA type you get in safe mode. Or I might just get a blank screen.

2. With both cards powered and enabled and one physically connected to the monitor, there's a 50-50 chance it will work. The secondary display shows up as a generic PnP monitor. I could live with this, but it's not consistent.

3. The only truly consistent result in Vista is with only one card powered. Not bad, but not great.

4. Disabling the 8800 in slot 1 usually either fails to boot Vista, or boots but fails to load the GTX adapter drivers, again as if it were in safe mode. I haven't tried the 8800 in PCIe slot 2 yet; I'm not comfortable with that setup because the 8800 blocks the fan on the GTX. It might solve the random booting issue, though.

Question: Is it even possible to have two cards connected to the same monitor in Vista (card 1 on the DVI-D port, card 2 on the DVI-I port)? I can see why this might not work.
 
News??

Hi,
have you made any steps forward, or are you still having problems with Windows?

Please keep us updated.
 
If I bought a GTX 295 and switched cards every time I wanted to use Windows, it would work fine, right? Is there anything special I need to do to make it work properly, like anything about connectors I need to know? I'd like to do some heavy gaming, but I don't feel like selling my Mac Pro to get one of the Nehalem models coming out later this year with, hopefully, a better card, and at the same time I doubt Apple will offer something as good as the GTX 295.
 
Good news: Vista 64 is now running smoothly with the GTX 295 in slot 1 and the 8800 in slot 2. The 8800 is powered, but its monitor cable is not connected; I only plug the second cable in to boot OSX. Vista takes a very long time to boot (1-2 minutes) when power is connected to both cards. It doesn't seem to matter which slot the 8800 is in; that card is the default primary display every time. I have monitor 3 designated as the primary display in order for it to use the 295 and recognize my monitor's driver. OSX just works.

The GTX fan runs steadily but not loudly. It does seem warmer with the bigger card in slot 1, but the other way around it was basically wasting a PCI slot. Disabling the 8800 results in the drivers not loading for the 295.

Overall the GTX 295 feels less stable than the 8800 GT; mostly I think this is due to driver issues. The GTX 285 OC or Black Edition might be a safer (and cheaper) bet to avoid dual-GPU hassles. However, since upgrading to the 182.06 drivers I'm seeing big improvements in stability over 181.22. As I've mentioned before, I'm getting a smooth 28-35 fps in Crysis Warhead at 1920x1200 with all settings at 'enthusiast' and 4xAA. However, in my heavily modded version of Oblivion I'm still getting a brutal 3 fps in some areas, with freezing and stuttering at the same settings. Other games I've tested (The Witcher, NWN2, AoC) are all running well on max settings.
 

Can you give us some explanation, with pictures or videos, of the hardware setup you made? It's really interesting and I want to try it out, but I don't understand how you did it.
 

Sure, I've been meaning to do that, just busy lately. I should have time this week to take some pics.

After a week with this setup, I should probably clarify something I said earlier:

It's quieter than I expected for such a huge beast, and not running very hot at all. It isn't any louder in OSX than it is in Windows.

... this only applies to OSX. I downloaded GPU-Z to check the actual temps in Vista with the 295... Idle temperature is 51C, fan speed 41%. After an hour of Crysis Warhead, my temp hit 88C (!) and the fan was up to 79%, and that's with the side panel off. After reading a bit more, that's not unusual for this card, but it seems crazy hot to me. Previously, the 8800 rarely got hotter than 65C after running for several hours. (Next project: GTX 295 H2O?)

This is interesting too: at first I thought something was wrong with the clock reading, but the GTX seriously clocks itself down when idle, which is great. It goes down to 300 MHz (core) and 100 MHz (memory).

There is no point in it, you can only have 8x (instead of 16x) speed on your GTX295.

I thought you could designate 1 slot to have 16 lanes in the 1st gen Mac Pro...
 

Thanks, I'll be waiting until you're free to tell us how you did it. I also hope those temperature numbers come down somehow.
Good luck :rolleyes:
 

88C isn't bad for two GTX 260s ;)

As long as the temp doesn't hit 100C, the card is fine. You could try a different cooler, but I don't think many of the aftermarket coolers for single-slot SLI cards still vent out the back of the case.

Look here for an example of an aftermarket cooler. As sweet as it looks, you still gotta move air past the fins, and you lose space around the card...
 
Not right

That's not right: in the first Mac Pro (2007) there is one x16 PCIe slot.

The Early 2008 Mac Pro has two of them.

Is no one with a 2007 Mac Pro on this forum interested in a GTX 295?

 
There's also a utility that allows you to adjust the lane configuration in the '06 & '07 models.

Apple's changes to the '08 included a second PCIe x16 slot, but the lane configuration is fixed. It's caused problems for those who want, or have installed, x8-lane RAID cards. (Mini-SAS cables won't reach slot 2, as the custom power cable is too short.)
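
For context on what that utility actually does: on the '06/'07 machines the four slots share a fixed pool of PCIe lanes, and the Expansion Slot Utility lets you choose how the pool is divided. A toy sketch of the budgeting idea in Python (the 40-lane total and the example splits are illustrative assumptions, not Apple's actual configuration tables):

```python
# Toy model of the per-slot lane budgeting the Expansion Slot Utility
# handles on the '06/'07 Mac Pro. The 40-lane pool and the splits below
# are illustrative assumptions, not Apple's actual configuration tables.
TOTAL_LANES = 40

def check_split(lanes_per_slot):
    used = sum(lanes_per_slot)
    fits = used <= TOTAL_LANES
    print(f"slots {lanes_per_slot}: {used}/{TOTAL_LANES} lanes ->",
          "fits" if fits else "over budget")
    return fits

check_split([16, 16, 4, 4])  # favor two graphics cards
check_split([16, 8, 8, 8])   # favor RAID/expansion cards
```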
 
Just bought the same card...need advice



Hi Punkmofo,

Can you give us some information regarding your working setup? I have exactly the same configuration as you: an Early 2008 Mac Pro 2.8 with the Nvidia 8800 pre-installed. On Boot Camp I have Vista 64, just like you.

I got the card and am trying to figure out the power scheme problem. I already have one PCIe power cord, and from what I have read so far, I will need to get another one from the ATI website (at least for here in Canada).

Did you use the supplied 2x 6-pin to 8-pin adapter that came with the card, or did you get a third-party PCIe 6-pin to 8-pin adapter cable instead to connect the 8-pin PCIe connector on the GTX 295?

Or did you only use the one PCIe cable that came with your 8800 to power the 295?

As for the 8800, from what I understand, you used the optical bay Molex plug to power it.

Any info would be very much appreciated,
Thanks!

Maidensblush.
 
In order to power the 8800 GT using the Molex from the optical drive, do you need a special cable or adapter to go from a 4-pin Molex to a 6- or 8-pin connector?
 
Here's the cable (6-pin PCIe power) from the logic board to the card.

You might be able to get away with the 4-pin Molex to 6-pin PCIe version, given the load isn't too high.
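
To put rough numbers on "the load isn't too high": a PCIe slot itself supplies up to 75 W, a 6-pin plug another 75 W, and an 8-pin plug 150 W, while a single 4-pin Molex line is usually rated well under that. A back-of-the-envelope sketch in Python (the TDP figures are the commonly published ones, and the Molex figure is a rough assumption; check your own card and PSU specs):

```python
# Nominal PCIe power sources, in watts (spec limits; real headroom varies).
SLOT = 75    # power drawn through the PCIe slot itself
PIN6 = 75    # one 6-pin PCIe plug
PIN8 = 150   # one 8-pin PCIe plug
MOLEX = 55   # rough assumption for one 4-pin Molex line

def budget(sources, tdp, label):
    supply = sum(sources)
    verdict = "OK" if supply >= tdp else "short"
    print(f"{label}: {supply} W available vs ~{tdp} W TDP -> {verdict}")

# GTX 295 (~289 W): needs the slot plus a 6-pin and an 8-pin feed, which is
# why one of the Mac Pro's two 6-pin cables gets a 6-to-8-pin adapter
# (and the adapter is only as good as the cable behind it).
budget([SLOT, PIN6, PIN8], 289, "GTX 295, 6-pin + adapted 8-pin")

# 8800 GT (~105 W): the slot plus one Molex-to-6-pin line is marginal but
# workable, matching the "load isn't too high" caveat above.
budget([SLOT, MOLEX], 105, "8800 GT on a Molex-to-6-pin adapter")
```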
 

Hey, I was reading your posts in some other guy's thread regarding the same power issue, and I was a little confused by what you said. You told him to use two ATI cables and a 6-pin to 8-pin adapter on the other 6-pin cable. That's three cables, but you did not state the location of the third power source. I understand the two coming from the logic board, but where does the third cable's power come from?
 