Dixie Flatline

macrumors newbie
Original poster
Aug 9, 2010
I just upgraded recently from a mid-2010 27" iMac to the 5K Retina iMac with M295X, and I've noticed a peculiar little change in behavior. I have a second monitor (HP ZR24w, connected via mini-DP to DP cable), which I used with the old iMac, and now with the new one.

I had a Win7 Boot Camp install on the 2010 iMac, and since I mostly use Windows for games, I disabled the second monitor in Windows (i.e., set it to display desktop only on monitor 1, the built-in display). I did that once, and it never changed. I've got Windows 8.1 Boot Camp on the Retina iMac, and I've tried to do the same thing as I did on the old iMac, but Windows keeps re-enabling the second monitor on boot, and I have to disable it again each time.

The funny thing I've discovered is this: if I reboot from Yosemite into Windows 8.1, the monitor is re-enabled in Windows. If I reboot from Windows into Windows (rather than going back to Yosemite), the monitor remains disabled until I do go back to Yosemite, and then it's enabled again on the next Windows boot. (I do have Windows fast startup disabled, in case it matters.) It's happened with both the December 2014 AMD Boot Camp drivers, and the newly released drivers from a few days ago.

Has anyone seen this happen, or have a good explanation for why it does? I'm wondering if it's something to do with the Thunderbolt controller re-detecting the monitor in OS X, and somehow presenting it to Windows as new hardware.

Fortunately, it's not a big deal to disable the second monitor since I discovered the Cmd-P shortcut (the Mac's Cmd key maps to the Windows key, so this is Windows' Win-P display toggle) to switch it off, but it's still a little strange and I'm curious about why it happens.
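For what it's worth, the Win-P overlay is backed by `DisplaySwitch.exe`, which ships with Windows 7 and later, so on a stock install you could in principle have Windows re-apply the internal-only mode automatically at every logon instead of toggling it by hand. A sketch of such a startup script, not something I've verified on the Retina iMac:

```shell
:: force-internal.cmd
:: Run at logon (e.g. drop a shortcut in the shell:startup folder, or
:: register it as a Task Scheduler logon task) so the external monitor
:: is re-disabled on every boot.
:: /internal shows the desktop on the built-in display only -- the same
:: as picking "PC screen only" from the Win-P overlay.
%windir%\System32\DisplaySwitch.exe /internal
```

Other switches (`/extend`, `/clone`, `/external`) select the remaining Win-P modes, so the same approach works if you ever want a different default.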
 