
jconly (macrumors member, original poster) — Jul 12, 2007, New York, NY
Hello everyone,

I'm currently running a current-gen Mac Pro with an 8800 inside, as well as the original ATI card.

When I connect my second monitor, should I connect it to the 8800 alongside the 24" Eizo, or plug it into the ATI separately?

I'm doing photo work, and if I'm correct, LR2 can utilize the card's VRAM.
Because of that, it seems to me I'd be better off with two separate cards, to allocate more VRAM to each monitor.

I'd appreciate thoughts and suggestions.

Thanks.
 
You probably won't see any performance benefit from using separate cards.

If you plan on having application windows span both displays, having two graphics cards can give you a noticeable performance hit. There are also the added fan noise and power consumption to consider.

It shouldn't be too hard to play around with it and see whether you notice any difference between the two setups, though. If you do some testing, please report back with your experience.
 

I'm with the OP on this one: if LR2 does indeed use onboard VRAM, then utilizing both cards should boost performance. Whether it's noticeable or not is another matter. I'd guess that if you're working with very large RAW images, you'll notice it once you start manipulating a lot of them.
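To put rough numbers on that (my own back-of-the-envelope arithmetic, not anything published by Adobe), here is what a single decoded RAW frame costs in memory, assuming a 12-megapixel sensor and 16 bits per channel:

# Back-of-the-envelope size of one decoded RAW image in memory.
# Assumed figures (not from Adobe): 12.1 MP sensor, 3 colour channels, 16 bits per channel.
megapixels = 12.1
channels = 3
bytes_per_channel = 2  # 16-bit

bytes_per_image = megapixels * 1_000_000 * channels * bytes_per_channel
print(f"One decoded image: {bytes_per_image / 2**20:.0f} MB")

# How many such images would fit in a hypothetical 512 MB of VRAM:
vram_mb = 512
print(f"Images that fit in {vram_mb} MB: {vram_mb // (bytes_per_image / 2**20):.0f}")

So a handful of full-size previews already eats a big chunk of a card's memory, which is why I'd expect heavy editing to show the difference if VRAM is actually being used.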
 

With both cards installed, you can in theory drive up to four 30" Apple Cinema Displays, since each DVI output is dual-link and each card has two DVI outputs. For Core Image (non-OpenGL) applications, the default ATI card would be faster!

I don't think LR2 actually optimises the VRAM (there's nothing on Adobe's site regarding this that I can see).
More likely it lets you dedicate a bigger chunk of system RAM to the application, similar to how Photoshop works.
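As a rough sanity check (my own arithmetic, assuming 32-bit colour and double buffering; not a measured figure), the desktops themselves only need a small slice of each card's VRAM:

# Rough framebuffer cost of driving 30" Apple Cinema Displays (2560 x 1600).
# Assumptions: 4 bytes per pixel (32-bit colour), double-buffered.
width, height = 2560, 1600
bytes_per_pixel = 4
buffers = 2  # front + back

per_display = width * height * bytes_per_pixel * buffers
print(f"Per 30-inch display: {per_display / 2**20:.0f} MB")           # ~31 MB
print(f"Two displays on one card: {2 * per_display / 2**20:.0f} MB")  # ~62 MB

So plain desktop use barely dents the VRAM on either card; whatever LR2 does with the rest is the real question.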
 

Yes, but if you read the OP's question, the context is available VRAM.
 

OK... I just took a closer look at my copy of Lightroom 2, and there are no user settings to optimise RAM of any sort. The enclosed attachment is the only thing even remotely related to RAM, and it's the Camera Raw cache.
 

Attachments

  • Picture 14.png (79.8 KB)
You are welcome to run some benchmarks, but connecting both monitors to the same card should be faster.

Two cards might have more VRAM combined, but one card cannot access the other card's VRAM. So data the OS keeps in VRAM to speed things up may have to be stored in both cards' VRAM. For example, if you play games that use lots of textures (one of the things that uses lots of VRAM), those textures have to be copied to both cards, taking twice as much time.
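A toy illustration of that point (purely illustrative numbers; the assumption is simply that assets used by a window spanning both cards must be resident in each card's VRAM):

# Toy model: VRAM consumed by a set of textures, one card vs. two cards.
# Assumption (illustrative): a window spanning two cards needs its textures
# resident on both, so shared assets are duplicated rather than split.
texture_sizes_mb = [64, 128, 32, 256]  # textures used by the spanning window

one_card = sum(texture_sizes_mb)
two_cards = 2 * sum(texture_sizes_mb)  # each card keeps its own copy

print(f"One card:  {one_card} MB of VRAM, uploaded once")
print(f"Two cards: {two_cards} MB of VRAM in total, uploaded twice")

The combined VRAM looks bigger on paper, but duplicated data and duplicated uploads mean you don't actually get to use it as one pool.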
 

...and how does that preclude LR2's ability to use available VRAM?
 

It doesn't. Though in my opinion the OP should boost the overall system RAM to help the application run more smoothly; as you know, previewing, opening and developing RAW files takes a lot of resources.

BTW...

I still can't find any reference on Adobe's site saying that LR2 optimises VRAM. However, there's a discussion going on at Flickr regarding LR2.
 
The same card is fine. Current cards have no issue supporting two screens, so I don't see a reason to use two cards.
 