
5050
macrumors regular · Original poster · May 28, 2009
Quick question for the GTX Titan pros out there. Is there a "heat issue" with stacking two GTX Titans in a Mac Pro 5,1? Or is special cooling required?

I have an external PSU, so no issue with supplying power to the 2 cards. Just curious if the two cards stacked on top of each other will have heat issues.

Thanks in advance!

Jonathan
 

This may only be slightly helpful, but I have 2 cards with no heat issues. Albeit they are 2 significantly less powerful cards that probably produce less overall heat than 2 Titans: a GTX 660 and a GT 120. I monitored temps with both cards installed and never noticed anything abnormal.

Mac Pros usually stay pretty cool compared to many other computers out there. I'd say buy them from a place with a good return policy and test them both out.

If you have space, you could always install them in the slots farthest apart, which might help a little bit too.
 
I installed two Titan Blacks on top of each other at the beginning of this week. I don't seem to have any heat issues, although OS X doesn't have the same GPU monitoring support that Windows has. I have the door off at the moment until I get my PCIe SSD relocated and can route the external PSU cables through the back; then I'll get a better idea of heat with the door on.

I tried to give them some breathing space by putting the second card in slot 3, but I wouldn't get a login screen when doing that. It only worked with slots 1 & 2.
 
The titans are designed to run in SLI on top of each other. No issues mate.
 
While this is true, most PCs would not have them lying 1/2 cm above a warm metal surface (i.e., no airflow underneath, just rising heat).

Best case, you try running them in Windows, where temp monitoring works.

At the end of the day, they have a thermal shutoff. If they start getting too hot, they just turn off, so they're typically unable to damage themselves.
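For anyone who does boot into Windows (or Linux) to watch temps, here's a rough sketch of reading them via NVIDIA's `nvidia-smi` tool. The `--query-gpu` flags are standard, but the helper functions here are just illustrative, not from any post in this thread:

```python
import subprocess

def read_gpu_temps(smi_output):
    """Parse `nvidia-smi --query-gpu=index,temperature.gpu --format=csv,noheader`
    output into a {gpu_index: temp_celsius} dict."""
    temps = {}
    for line in smi_output.strip().splitlines():
        idx, temp = (field.strip() for field in line.split(","))
        temps[int(idx)] = int(temp)
    return temps

def query_gpu_temps():
    # Requires an NVIDIA driver install that ships nvidia-smi.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=index,temperature.gpu",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return read_gpu_temps(out)

# Sample output shape for two stacked cards (made-up numbers):
sample = "0, 81\n1, 88\n"
print(read_gpu_temps(sample))  # {0: 81, 1: 88}
```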
 
I ran dual titans with no heat issues. I even folded with them 24/7. I also had an internal power supply.

I tried that earlier this week with my FSP Booster and two Titan Blacks, and rendering with both cards immediately reset the machine. Not sure why, but connecting an external 750W PC power supply fixed it, so I assume it was a power issue.
 
A dual-Titan Mac Pro is definitely a great performer for CUDA. But if you consider OpenCL performance, dual HD 7970s are much better.

You won't have to downclock or undervolt like I did, since you have external power.

https://forums.macrumors.com/threads/1732849/
 
Auto shutoff. Good point, I hadn't considered that! Might be worth setting up the rig and seeing how it performs. If all else fails, I can look into an external solution then.

Do you know what the minimum OS requirement is for the Titan Black?

I have one of the original Titans (GK110A), and would be happy to have another original Titan GK110A, but they are more difficult to find now. Do the Titan GK110B and Titan Black still have OpenCL compatibility issues, or has this been officially resolved?

----------

What software are you using for GPU acceleration with the Titan Blacks?

Also, how are you routing your external PSU cables out from your Mac? I'm currently using an empty PCIe slot to run my cables out. If I add that extra Titan, I may have to run my system with the cover open (slot 4 occupied with Sonnet Tempo SSD Pro and slot 3 by Blackmagic Decklink Mini Monitor).
 
I'm using Octane Render. Unfortunately, I'm routing my cables in with the door off. Since I have a Velocity Solo PCIe SSD in slot 4, I have to decide how I want to continue: either leave it as-is, or put the SSD back in a SATA bay and run the cables in through the open slot so I can close the door. I'm not sure yet.

I believe the GK110B chips require 10.9.2 or above with the latest web driver.
 
I'm stuck with the same dilemma. Too many PCIe slots in use. Here's my configuration:

Slot 5: Sonnet Tempo SSD Pro
Slot 4: Blackmagic Decklink Mini Monitor
Slot 3: Empty (power cables from Titan to external PSU)
Slot 2: GTX Titan
Slot 1: GTX Titan

I've considered a myriad of options, but it seems there's no way of having 2 internal GPUs without sacrificing other critical areas of system performance. Most likely a Cubix or Netstor external PCIe expansion chassis will be necessary. Space is at a premium with my setup, though, and I was hoping to make use of a compact external PCIe chassis like Netstor's NA211A, but it just doesn't have the juice or bandwidth to maximize the Titan externally (one PCIe slot will again be wasted running cables out, and it's only 20 Gbps).

I was opposed to the design of the nMP for a while, but it seems like moving components out of the computer might not be a bad idea, since I'm already doing so much of that.

I wish there was a way to upgrade the internal USB to 3.0 and the 4 onboard SATA ports to 6 Gbps without PCIe cards! That would solve many problems!
 
I know. It wouldn't solve everything (and might not even work for you), but instead of putting a second Titan in the Netstor, could you put the other two PCIe cards in there? They don't draw much power and don't seem to need the same bandwidth. The Tempo won't saturate 20 Gbps, and the DeckLink only needs an x4 link anyway, right? You'd still have to run with the door off, because the Netstor host card would fill the top slot. And I'm not sure if the SSD can boot from the Netstor, hmm. Anyway, just an idea.

For me, I don't like the idea of running with the door off, but I don't have to look at it (it's under my desk and facing a divider). Plus the temps seem to be good and it's not loud. We'll see, I might just leave it this way for now.
 
I've been running 2 Titans since January without issue. I used them full-tilt in an Octane Team Render with my dual-780Ti z820 on cinematics for the new Transformers game. I never experienced any heat issues or anything odd.

I agree on the slot count dilemma. I ultimately have invested in another upgraded 4,1 as a very not-inexpensive solution. Good times.
 
I was thinking the same as well. Trying to find a solution that keeps the cover closed is not easy! If only there was one more PCIe slot . . .

How about 2 x Sonnet Tempo SSD Pro PCIe cards with 4 SSDs (RAID 0) plus the Decklink Mini Monitor? Enough juice and bandwidth to squeeze out of the Netstor NA211A for these 3 cards (NA211A = 20 Gbps, 250W)?

And no worries about booting with the Tempo; it's the designated scratch disk.
 
Hmm, both the Tempo and the DeckLink are PCIe 2.0 x4 cards sharing that 20 Gbps link. If you've got 4 x 6 Gbps SSDs running in RAID 0, that's a theoretical 24 Gbps, so in real-world bandwidth that's likely saturation and a pretty good match. The question is how often you'll use your DeckLink and scratch storage at the same time. I'm a 3D guy, not a video guy, so I don't know the exact workflow. I also don't know what kind of bandwidth your DeckLink needs; I'm guessing it depends on the footage you're ingesting/outputting. But if their usage doesn't overlap a ton, it might not be a real bottleneck. If it does, it might.

Power-wise you should be fine. All of them are bus-powered, and even if they used all available PCIe power (which they won't), it would be 225W (75W x 3).

There's always the NA250A or NA255A, which have more slots and more bandwidth, but you're getting into a much larger investment: ~$2,000 unless you find one used.
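Back-of-the-envelope version of the math above, just to make the budget explicit. The figures (20 Gbps link, 75W per slot, 250W chassis) are the ones quoted in this thread, not numbers I've verified against spec sheets:

```python
# Figures as quoted in the thread (not verified against spec sheets):
LINK_GBPS = 20       # Netstor NA211A host link
SLOT_POWER_W = 75    # bus power per PCIe slot, per Apple's Mac Pro figure

ssd_raid_gbps = 4 * 6                  # 4 SATA III SSDs in RAID 0, theoretical
worst_case_power_w = 3 * SLOT_POWER_W  # if all 3 cards drew full bus power

print(ssd_raid_gbps)       # 24 Gbps demanded vs the ~20 Gbps link
print(worst_case_power_w)  # 225 W vs the NA211A's 250 W
```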
 
Thanks riggles.

SDI is 3 Gbps, so it looks like the total for the 4 x SSDs (24 Gbps) with the Decklink Mini Monitor (3 Gbps) would theoretically be 27 Gbps. I wonder how the bandwidth gets divvied up once you surpass the limit of 20 Gbps.

Regarding your numbers for 75W x 3, would that be based on each PCIe 2.0 x4 slot supporting up to 75-watt cards?

This might be transitioning into another thread discussion, but maybe the configuration for the 3 external PCIe slots could be:

Sonnet Tempo SSD Pro (2 x 6 Gbps = 12 Gbps)
Blackmagic Design Decklink Mini Monitor (3 Gbps)
eSATA/USB 3.0 (6 Gbps)

Comes in very close to 20 Gbps (21 Gbps if my math is right). That being said, any suggestions for a good eSATA/USB 3.0 card?
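The same arithmetic for the proposed three-card mix, as a quick sanity check (per-card figures as quoted above; the card names are just labels):

```python
# Per-card bandwidth figures as quoted above (Gbps):
cards = {
    "Sonnet Tempo SSD Pro (2 x 6 Gbps SSDs)": 12,
    "DeckLink Mini Monitor (3G-SDI)": 3,
    "eSATA/USB 3.0": 6,
}
total_gbps = sum(cards.values())
print(total_gbps)        # 21
print(total_gbps <= 20)  # False: slightly oversubscribed on paper
```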
 
Thanks riggles.

SDI is 3 Gbps, so it looks like the total for the 4 x SSDs (24 Gbps) with the Decklink Mini Monitor (3 Gbps) would theoretically be 27 Gbps. I wonder how the bandwidth gets divvied up once you surpass the limit of 20 Gbps.
That I don't know. But I think it'd still be some silly fast storage speed.

Regarding your numbers for 75W x 3, would that be based on each PCIe 2.0 x4 slot supporting up to 75-watt cards?
It's just based on Apple saying the Mac Pro provides up to 75W to the PCIe slot itself. So if the cards can be powered inside the Mac Pro just fine, they're not using more than 75W.

This might be transitioning into another thread discussion, but maybe the configuration for the 3 external PCIe slots could be:

Sonnet Tempo SSD Pro (2 x 6 Gbps = 12 Gbps)
Blackmagic Design Decklink Mini Monitor (3 Gbps)
eSATA/USB 3.0 (6 Gbps)

Comes in very close to 20 Gbps (21 Gbps if my math is right). That being said, any suggestions for a good eSATA/USB 3.0 card?
I haven't gotten into eSATA/USB 3.0; I haven't had the need for it. I read some threads on it here a few months back, but I can't keep track of which ones had which drawbacks and which seemed to be the best.
 
Hello everyone, I'm new here.

Can I ask which external/internal power supply I need to buy in order to use a (flashed) Titan Black on a Mac? I would like the Newegg ones, but there's no shipping to the UK.

Thank you!
 
Shows it's still in stock :confused: Also, what do you mean "both"? He provided a link to a single internal PSU...

And that is the same one I ran to power 2 Titans in my 4,1. Plenty of juice and 100% stable; I never had any power issues running it in my Mac Pro.

Oddly, my 450W FSP Booster could not handle my two Titan Blacks, not even on startup. I had to resort to an external standard 750W PSU. I run the cables through the top PCI opening so I can close the door.
 
Hi, can your cards work in SLI? Ta.
 