
seabass069 (original poster):
I just bought a new LCD monitor. Is it better to use the digital cable or the analog cable? I plugged in the digital cable and I do not get a signal until Windows boots up. (My Mac Pro is on order; I found a pretty good deal on the monitor, so I am testing it on my PC.) My concern is that I do not see the boot screen where I can get into the BIOS. The monitor is not detected until Windows boots up. Is there a BIOS setting that will detect the digital monitor?
 
did you install all the appropriate drivers for the monitor?
 
That's weird... I haven't seen that before. Do you *have* an analog cable? If you do, can't you just plug it in whenever you need BIOS? Shouldn't be that often, right? Outside of this issue (which I don't think will be an issue with the Mac Pro), digital is generally way better, because you won't get any fuzziness or distortion from the digital-to-analog-to-digital round trip a VGA signal forces on an LCD.
 
No drivers should be required at the BIOS level, since no OS is running yet, so installing monitor drivers won't change what happens before Windows boots.

I just switched from VGA to DVI on my PC, and it only started working well when I removed the VGA cable entirely. Now the BIOS screens come out the DVI and the monitor selects the DVI input automatically.

FWIW the image does seem a bit sharper than with the VGA.

B
 
Do you think I need to connect the analog cable, let the BIOS see that there is a monitor, then shut off the PC, disconnect the analog cable, and connect the digital cable?
 
I don't understand what you are trying to accomplish. Right now, the monitor comes on and works normally under DVI, once Windows boots, right? So what is the issue? Do you *need* to get into BIOS right now? Or are you worried that, in the absence of a BIOS detection of the monitor, something will not work? Does the monitor work normally in Windows now? Is there any reason you can't leave good enough alone? :D
 
Well, I like to see the boot-up screen. Plus, I have Win XP Pro and Win 2000 on a dual boot (2000 is for work purposes). The monitor does not come out of sleep mode until the Windows screen comes on, so I can't access my dual-boot selections. I called the manufacturer, and they said it could possibly be a setting on the video card; it has nothing to do with the monitor. My card is an ATI X700 PCIe.
 
Ahhh, now I'm up to speed!

I bet there's some subtlety about how it handles DVI outputs... actually, I did a quick Google. This thread is long on complaints and short on solutions, but....

http://www.sapphiretech.com/en/forums/showthread.php?t=383&page=1&pp=10

It appears that this issue has to do with a squabble over how to implement VESA standards. The BIOS doesn't really have "video drivers" per se. Rather, it sends a standards-based set of requests to the video card that is supposed to result in the card providing the traditional text-screen display the BIOS uses... But for this to work properly, everything in the system has to be on the same page about how to respond to those standard requests. It seems that in your case, one or more pieces of your hardware are not fully compliant....
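To make that handshake concrete: the monitor-detection half of it is VESA's EDID standard, where the display hands the host a 128-byte identity block over the DDC pins (present on both VGA and DVI connectors). As a rough Python sketch of what "compliant" means at that level, here's a validator; the sample bytes are invented for illustration, not read from any real monitor:

Code:
# Minimal sketch: validate a 128-byte VESA EDID block, the identity
# data a monitor returns over DDC. All sample bytes are invented.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def validate_edid(block: bytes) -> bool:
    """Check the fixed 8-byte header and the checksum rule:
    all 128 bytes must sum to 0 modulo 256."""
    return (len(block) == 128
            and block[:8] == EDID_HEADER
            and sum(block) % 256 == 0)

def manufacturer_id(block: bytes) -> str:
    """Decode the three-letter PNP vendor ID packed into bytes 8-9
    as three 5-bit letters (1 = 'A'), big-endian."""
    word = (block[8] << 8) | block[9]
    return "".join(chr(((word >> s) & 0x1F) + ord("A") - 1)
                   for s in (10, 5, 0))

# Build a fake-but-valid block: header, "VSC" (ViewSonic's PNP ID),
# zero padding, and a last byte chosen to satisfy the checksum.
block = bytearray(EDID_HEADER) + bytearray(120)
block[8], block[9] = 0x5A, 0x63           # packs the letters V, S, C
block[127] = (256 - sum(block[:127])) % 256
print(validate_edid(bytes(block)), manufacturer_id(bytes(block)))

If any link in that chain cuts corners, or the card doesn't bother doing the detection until its Windows driver loads, you get exactly this "no signal until Windows" behavior.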

I'm not even sure you can do this for Boot Camp, let alone Windows, but I know that in OS X there is a System Preferences pane (Startup Disk) that lets you select the device/partition from which you wish to boot next. So you can select that, restart, and boot into your desired OS without needing to intervene at boot time. Is this possible in Windows?
 
I am going to look into that. Thanks for the info. I am also going to see if my BIOS needs to be upgraded.
 
Picking the startup disk ahead of time does work for Boot Camp, but there's no real equivalent for Windows. The info about which OS to boot is in the BOOT.INI file, so I guess you could edit that on the fly... http://support.microsoft.com/kb/289022
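If you wanted to script the on-the-fly edit, here's a rough Python sketch of the idea. BOOT.INI normally sits at C:\boot.ini with hidden/system/read-only attributes (clear them first with attrib -r -s -h c:\boot.ini), and the ARC path below is a made-up example; copy the exact string from the [operating systems] section of your own file:

Code:
# Rough sketch: point the default= line of BOOT.INI at the OS you
# want on the next boot, so the invisible boot menu doesn't matter.
# Run this from the currently booted Windows, then reboot.

BOOT_INI = r"C:\boot.ini"

# Hypothetical Windows 2000 entry; substitute the exact ARC path
# from the [operating systems] section of your own boot.ini.
NEXT_OS = r"multi(0)disk(0)rdisk(0)partition(2)\WINNT"

def set_default(path, arc_path):
    with open(path, "r") as f:
        lines = f.readlines()
    with open(path, "w") as f:
        for line in lines:
            # Rewrite only the default= line in [boot loader].
            if line.strip().lower().startswith("default="):
                f.write("default=" + arc_path + "\n")
            else:
                f.write(line)

set_default(BOOT_INI, NEXT_OS)

You could also drop the timeout= value in [boot loader] so the menu you can't see doesn't sit there counting down.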

Have you checked for any firmware updates for your video card? It might be that DVI out is only enabled after boot.

B
 
I have upgraded my motherboard's BIOS. I reinstalled the video card drivers, and still the monitor is not being detected by the BIOS. I tried to find out how to upgrade my video card's firmware, but no luck. Tech support at AMD is not there on the weekend; I had no problem talking with someone from ViewSonic. I am going to call AMD on Monday. I know my video card is old now, but that still should not affect it being detected before Windows starts.
 
The tech at ATI wrote me and said that if I want to get into my BIOS, I need to connect the VGA cable or use a DVI-to-VGA adapter. I think ATI knows about this problem and is not saying anything. I just can't believe that the video card only lets a VGA setup see the BIOS. I have read many forum discussions related to this issue. I would think there would be a video card BIOS flash to address it. Does anyone have an explanation why a DVI setup is not detected until Windows fully boots? This is so frustrating.
 
Crappy video card implementation?

I just finally bought a DVI cable for my Dell Dimension 4600 and did not encounter this problem with either the original Nvidia card I removed or the ATI X1300 I replaced it with. I see the BIOS screen over DVI just as I did over VGA.
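One way to narrow your problem down: once Windows is up, the Plug and Play enumerator caches the EDID block each monitor has reported in the registry under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY (that layout is my assumption from how the XP-era enumerator stores things). A quick sketch like this, using Python 3's standard winreg module, lists what's cached; if your ViewSonic shows a 128-byte EDID over DVI, the monitor-and-cable handshake is fine once drivers load, and it really is the card's pre-boot behavior, like ATI said:

Code:
# Sketch: list EDID blocks Windows has cached for enumerated
# monitors. Assumed registry layout:
#   HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY\<model>\<instance>\
#       Device Parameters\EDID   (REG_BINARY, usually 128 bytes)
import winreg

ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def subkeys(key):
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
            i += 1
        except OSError:            # no more subkeys
            return

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ROOT) as display:
    for model in subkeys(display):
        with winreg.OpenKey(display, model) as model_key:
            for inst in subkeys(model_key):
                try:
                    with winreg.OpenKey(model_key,
                                        inst + r"\Device Parameters") as p:
                        edid, _ = winreg.QueryValueEx(p, "EDID")
                    print(model, inst, len(edid), "bytes of EDID")
                except OSError:
                    pass           # no cached EDID for this instance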

:confused:

B
 
I ended up buying an ATI X1900 GT card, and the DVI monitor is detected right from boot-up. My last card, the ATI X700, was two years old; I think either the card was just bad or the technology had not caught up with that card. Thanks for everyone's replies. This new card rocks.
 