
robertdsc
macrumors regular, Original poster
Right now, I use an Apple Cinema HD ADC display with my Mac Pro 1,1. It is connected via the ADC-to-DVI converter and everything works fine.

On the second port of the Mac Pro's video card, I have an Acer 23-inch monitor connected via DVI. I bought this monitor used last year, and it can only be turned off via a switch on the power strip it is connected to. So far this setup works fine: I can do my work when two monitors are needed, then switch it off when only one is necessary.

My question is this: if I were to pull a second Cinema HD monitor out of storage (it has no back leg; I used a guitar stand when I last had it in operation), put it in place of the Acer monitor, and connect it to the Mac via an ADC-to-DVI converter, would I run into problems plugging the converter's power cable into the power strip and turning the display on and off via the power strip's switch?

The Acer monitor was a lucky find and I know it's not perfect, so turning it on and off this way is not a problem. But the Apple monitors are different: even if I do the wiring right, will cutting power this way damage the monitor?
 
Do you have two ADC adapters? If so, I don't see what the problem is. They're just PSUs with a USB passthrough. You can switch off the ACD with the touch controls on the base of the bezel.
 
I guess I didn't frame my question right.

Would the Cinema Display eventually suffer damage if I turned it off via the switch on the power strip its ADC-to-DVI converter is connected to?

I can turn the Acer monitor on and off without any issues; indeed, that's the only way I can actually get it to turn off, since the monitor was bought used and already in that condition.
 
Why would you do so? The ACDs have a physical on/off switch anyway. I am not sure what you are trying to achieve here.
 
My understanding of ADC displays is that the power button only has two modes: either disabled, or acting as a substitute for the Mac's power switch. Press it and the Mac sleeps, not just the display. I have three ADC displays and have yet to figure out a way to shut them off directly, let alone turn them on when there is no video signal.

Or am I misunderstanding?
 
Anyway, to answer the OP's original question: at least when the ACD is connected directly to the ADC port on the video card, pulling the power whilst it is on can kill the ACD.
 
Well, today I did it.

I pulled the Cinema HD Display out of storage and plopped it on the desk, installed the guitar stand, then plugged everything in. The ADC-to-DVI adapter's power cable went into the power strip, the monitor's ADC cable went into the adapter, and the DVI cable went into the Mac Pro.

I started up the Mac with the second HD Display turned off. After logging in on the main HD display, I flipped the switch on the power strip. The main display blinked, then the second display came online.

In Snow Leopard, I've found that when adjusting monitors, the Mac briefly "resets" to the default wallpaper before going back to the random-picture folder I've selected in System Preferences. The second HD Display did this when I turned it off and on at the power strip.

While I dislike the amount of real estate these monitors take up on my desk, it is a dream to finally have two Cinema HD monitors to use at the same time.

My question now is a little different. Now that I have this setup and have verified that it works, would it be possible to run it on a G5 that has a single 128 MB video card? This Mac Pro has an NVIDIA GeForce 8800 GT with 512 MB.
 
Yes, you should be able to use a single 128MB video card with two 23" ADC displays. Each has a minimum requirement of 64MB of video memory, so with both sharing one card their performance may suffer a bit, but they will work.
 
Intell,

You seem like the person to answer this, and it's something I've been wondering about for a while.

Does the amount of VRAM primarily dictate the maximum number of pixels that a card can drive? From looking at our ongoing benchmark thread, I'm guessing that's the case as it seems as though otherwise identical cards with different amounts of VRAM perform the same.

I know, for example, that the primary difference between the GeForce 2 MX and the 2 MX Twin-View is that the Twin-View has 64MB of VRAM compared to the 32MB on the standard card.

Also, looking at a more advanced card in the Radeon 9600, I know that the XT has 128MB and the Pro has 256MB. The XT is supposedly the higher-performing card (although several of us have found that they benchmark the same in a 4x AGP slot), but the Pro can drive a 30" Cinema where the XT cannot.
 
There is a limit. For most PowerPC cards it is an artificial one hard-coded into their ROMs. When a card lacks enough memory, such as an ATI Rage 128 with 16MB, it cannot drive a large display at its full resolution. It will instead drive it at the highest resolution that card supports, often with great performance hits. The GeForce 2 MX/Twin-View difference is all in the ROM; there is a ROM that you can flash to a regular 2 MX that will allow it to drive two displays. For the 30" display, a dual-link DVI port is required. Only some of the ATI 9600/9650 cards have one, despite having enough memory.
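
To put rough numbers on the dual-link point, here's a back-of-the-envelope Python sketch. The 60 Hz refresh rate is my assumption, and real pixel clocks run higher once blanking intervals are added, so these are minimums:

# Single-link DVI tops out at a 165 MHz pixel clock. Even counting only
# active pixels at 60 Hz (an assumed refresh rate), the 30" display's
# pixel rate blows past that limit, while the 23" fits comfortably.
SINGLE_LINK_MHZ = 165

for width, height, label in [(1920, 1200, '23" Cinema HD'), (2560, 1600, '30" Cinema HD')]:
    mhz = width * height * 60 / 1e6  # millions of active pixels per second
    verdict = "fits single link" if mhz <= SINGLE_LINK_MHZ else "needs dual link"
    print(f"{label}: at least {mhz:.0f} MHz -> {verdict}")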
 
Taking Apple-imposed restrictions out of it... you basically only need 16MB of frame buffer for anything this side of a retina display.

8MB of VRAM is enough for a 1920x1200 24-bit colour frame buffer with memory to spare:
1920*1200*24/(8*1048576) = 6.59MB

You'd need 16MB for 32-bit colour, since 8.79MB overflows an 8MB part:
1920*1200*32/(8*1048576) = 8.79MB

Quartz Extreme will use additional VRAM to improve performance; it'll likely use double buffering and texture maps where it can for compositing, then fall back to system memory.
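
If you want to play with the arithmetic yourself, here's a quick Python sketch of the same calculation. The 23" figures match the ones above; the 30" rows are mine, added to illustrate the "16MB for anything this side of a retina display" claim:

# Raw framebuffer size in MiB for a single screen at a given colour depth.
MIB = 1048576  # bytes per MiB, as used in the calculations above

def framebuffer_mb(width, height, bits_per_pixel):
    return width * height * bits_per_pixel / (8 * MIB)

for width, height, label in [(1920, 1200, '23" Cinema HD'), (2560, 1600, '30" Cinema HD')]:
    for bpp in (24, 32):
        print(f"{label} @ {bpp}-bit: {framebuffer_mb(width, height, bpp):.2f}MB")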
 
An update:

After a few days of having the two acrylic monitors on my desk, I decided to scrap the setup and drop the Acer back in.

Every time I would power on the second Cinema HD screen, the wallpapers would reset briefly on both screens, then resume displaying their normal rotations. With the Acer monitor plugged in, that reset does not happen.

I also noticed that the main Cinema HD monitor would leave ghost images on the screen when the reset occurred, and also on shutdown. Again, with the Acer plugged in, that does not happen. Strange.
 